Test Report: Hyperkit_macOS 18169

248a87e642b5c2a9040ef2ce1129e71918aa65a4:2024-02-13:33129

Failed tests (6/328)

TestStartStop/group/default-k8s-diff-port/serial/SecondStart (478.67s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-603000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4
E0213 15:53:39.083776    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.089507    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.099669    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.120976    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.162304    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.242806    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.404047    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:39.726039    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:40.367566    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:41.649174    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:44.366309    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:49.486909    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:53:50.861370    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:53:53.826648    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:53.832007    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:53.842688    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:53.862790    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:53.904217    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:53.984514    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:54.144963    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:54.466062    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:55.107295    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:56.387490    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:58.947807    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:53:59.729067    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:54:01.425172    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:54:04.004350    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:54:04.068291    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:54:05.720302    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:54:14.309238    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:54:20.211065    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:54:34.789960    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:54:35.951504    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:54:43.079495    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:55:00.028678    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:55:01.173464    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:55:15.751304    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:55:25.204684    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:55:26.503121    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:55:46.802980    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:56:07.842986    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:56:23.096529    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:56:37.673833    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:56:49.556064    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:57:04.527508    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:57:09.856383    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:57:16.917805    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:57:25.138502    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p default-k8s-diff-port-603000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4: exit status 80 (6m43.504120504s)

-- stdout --
	* [default-k8s-diff-port-603000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting control plane node default-k8s-diff-port-603000 in cluster default-k8s-diff-port-603000
	* Restarting existing hyperkit VM for "default-k8s-diff-port-603000" ...
	* Preparing Kubernetes v1.28.4 on Docker 24.0.7 ...
	* Configuring bridge CNI (Container Networking Interface) ...
	* Verifying Kubernetes components...
	  - Using image fake.domain/registry.k8s.io/echoserver:1.4
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	  - Using image docker.io/kubernetesui/dashboard:v2.7.0
	  - Using image registry.k8s.io/echoserver:1.4
	* Some dashboard features require the metrics-server addon. To enable all features please run:
	
		minikube -p default-k8s-diff-port-603000 addons enable metrics-server
	
	* Enabled addons: storage-provisioner, default-storageclass, metrics-server, dashboard
	
	

-- /stdout --
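The dozens of `cert_rotation.go:168` errors interleaved in the run log all have the same shape: the client's cert-rotation loop still references `client.crt` files under profiles (no-preload-355000, old-k8s-version-481000, etc.) that earlier tests already deleted, so they are likely leftover noise from stale kubeconfig entries rather than, by themselves, the cause of the exit status 80. A minimal sketch of that condition, in a throwaway directory (all paths below are hypothetical, not the Jenkins paths from this run):

```shell
# Sketch only: reproduce the "client.crt: no such file or directory" condition
# in a temp dir. Profile names here are made up for illustration.
base="$(mktemp -d)"
mkdir -p "$base/profiles/live-profile" "$base/profiles/deleted-profile"
touch "$base/profiles/live-profile/client.crt"   # cert still on disk
# deleted-profile has no client.crt, mimicking a torn-down minikube profile
# whose kubeconfig entry was left behind.
for d in "$base"/profiles/*/; do
  # flag any profile dir whose client cert no longer exists
  [ -f "${d}client.crt" ] || echo "stale cert reference: $d"
done
# only deleted-profile is reported
```

A scan like this over `$MINIKUBE_HOME/profiles` would separate these recurring warnings from the actual `SecondStart` failure when triaging the report.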
** stderr ** 
	I0213 15:53:39.031988   10919 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:53:39.032264   10919 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:53:39.032269   10919 out.go:304] Setting ErrFile to fd 2...
	I0213 15:53:39.032273   10919 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:53:39.032469   10919 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:53:39.033910   10919 out.go:298] Setting JSON to false
	I0213 15:53:39.057539   10919 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4593,"bootTime":1707863826,"procs":449,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 15:53:39.057666   10919 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 15:53:39.079137   10919 out.go:177] * [default-k8s-diff-port-603000] minikube v1.32.0 on Darwin 14.3.1
	I0213 15:53:39.190219   10919 out.go:177]   - MINIKUBE_LOCATION=18169
	I0213 15:53:39.153162   10919 notify.go:220] Checking for updates...
	I0213 15:53:39.264965   10919 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:53:39.323341   10919 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 15:53:39.420312   10919 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 15:53:39.442029   10919 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 15:53:39.484115   10919 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0213 15:53:39.505671   10919 config.go:182] Loaded profile config "default-k8s-diff-port-603000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:53:39.506320   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:53:39.506407   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:53:39.515545   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57416
	I0213 15:53:39.515892   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:53:39.516349   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:53:39.516379   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:53:39.516594   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:53:39.516701   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:39.516901   10919 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 15:53:39.517143   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:53:39.517169   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:53:39.525054   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57418
	I0213 15:53:39.525383   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:53:39.525755   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:53:39.525771   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:53:39.525999   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:53:39.526104   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:39.555221   10919 out.go:177] * Using the hyperkit driver based on existing profile
	I0213 15:53:39.596896   10919 start.go:298] selected driver: hyperkit
	I0213 15:53:39.596919   10919 start.go:902] validating driver "hyperkit" against &{Name:default-k8s-diff-port-603000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:
22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-603000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP:192.169.0.44 Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s Schedule
dStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:53:39.597143   10919 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0213 15:53:39.601254   10919 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 15:53:39.601354   10919 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 15:53:39.609098   10919 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 15:53:39.614409   10919 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:53:39.614432   10919 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 15:53:39.614600   10919 start_flags.go:927] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0213 15:53:39.614663   10919 cni.go:84] Creating CNI manager for ""
	I0213 15:53:39.614676   10919 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 15:53:39.614687   10919 start_flags.go:321] config:
	{Name:default-k8s-diff-port-603000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-6
03000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP:192.169.0.44 Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:fal
se ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:53:39.614825   10919 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 15:53:39.657344   10919 out.go:177] * Starting control plane node default-k8s-diff-port-603000 in cluster default-k8s-diff-port-603000
	I0213 15:53:39.678927   10919 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I0213 15:53:39.678993   10919 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I0213 15:53:39.679024   10919 cache.go:56] Caching tarball of preloaded images
	I0213 15:53:39.679212   10919 preload.go:174] Found /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0213 15:53:39.679234   10919 cache.go:59] Finished verifying existence of preloaded tar for  v1.28.4 on docker
	I0213 15:53:39.679405   10919 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/config.json ...
	I0213 15:53:39.680419   10919 start.go:365] acquiring machines lock for default-k8s-diff-port-603000: {Name:mke947868f35224fa4aab1d5f0a66de1e12a8270 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0213 15:53:39.680530   10919 start.go:369] acquired machines lock for "default-k8s-diff-port-603000" in 86.799µs
	I0213 15:53:39.680565   10919 start.go:96] Skipping create...Using existing machine configuration
	I0213 15:53:39.680578   10919 fix.go:54] fixHost starting: 
	I0213 15:53:39.680937   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:53:39.680968   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:53:39.689644   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57420
	I0213 15:53:39.690020   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:53:39.690404   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:53:39.690422   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:53:39.690618   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:53:39.690705   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:39.690791   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:53:39.690871   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:53:39.690932   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10848
	I0213 15:53:39.691924   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid 10848 missing from process table
	I0213 15:53:39.691960   10919 fix.go:102] recreateIfNeeded on default-k8s-diff-port-603000: state=Stopped err=<nil>
	I0213 15:53:39.691986   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	W0213 15:53:39.692070   10919 fix.go:128] unexpected machine state, will restart: <nil>
	I0213 15:53:39.734051   10919 out.go:177] * Restarting existing hyperkit VM for "default-k8s-diff-port-603000" ...
	I0213 15:53:39.755198   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Start
	I0213 15:53:39.755501   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:53:39.755574   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/hyperkit.pid
	I0213 15:53:39.757088   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid 10848 missing from process table
	I0213 15:53:39.757109   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | pid 10848 is in state "Stopped"
	I0213 15:53:39.757128   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/hyperkit.pid...
	I0213 15:53:39.757441   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Using UUID 32f93a59-b32d-4016-9ad2-e6755b97ad7f
	I0213 15:53:39.781929   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Generated MAC ba:b2:e8:3b:2f:ed
	I0213 15:53:39.781963   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-603000
	I0213 15:53:39.782101   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"32f93a59-b32d-4016-9ad2-e6755b97ad7f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0004f7260)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pi
d:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 15:53:39.782166   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"32f93a59-b32d-4016-9ad2-e6755b97ad7f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0004f7260)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pi
d:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 15:53:39.782205   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "32f93a59-b32d-4016-9ad2-e6755b97ad7f", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/default-k8s-diff-port-603000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18169-2790/
.minikube/machines/default-k8s-diff-port-603000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-603000"}
	I0213 15:53:39.782237   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 32f93a59-b32d-4016-9ad2-e6755b97ad7f -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/default-k8s-diff-port-603000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/console-ring -f kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/bzimage,/Users
/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-603000"
	I0213 15:53:39.782252   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0213 15:53:39.783692   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 DEBUG: hyperkit: Pid is 10930
	I0213 15:53:39.784171   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Attempt 0
	I0213 15:53:39.784194   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:53:39.784284   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:53:39.785883   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Searching for ba:b2:e8:3b:2f:ed in /var/db/dhcpd_leases ...
	I0213 15:53:39.785972   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:53:39.785996   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:53:39.786010   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd5243}
	I0213 15:53:39.786024   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Found match: ba:b2:e8:3b:2f:ed
	I0213 15:53:39.786040   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | IP: 192.169.0.44
	I0213 15:53:39.786086   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetConfigRaw
	I0213 15:53:39.786713   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetIP
	I0213 15:53:39.786879   10919 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/config.json ...
	I0213 15:53:39.787229   10919 machine.go:88] provisioning docker machine ...
	I0213 15:53:39.787240   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:39.787370   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetMachineName
	I0213 15:53:39.787487   10919 buildroot.go:166] provisioning hostname "default-k8s-diff-port-603000"
	I0213 15:53:39.787501   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetMachineName
	I0213 15:53:39.787627   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:39.787759   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:39.787877   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:39.787981   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:39.788077   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:39.788507   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:39.788816   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:39.788826   10919 main.go:141] libmachine: About to run SSH command:
	sudo hostname default-k8s-diff-port-603000 && echo "default-k8s-diff-port-603000" | sudo tee /etc/hostname
	I0213 15:53:39.791879   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0213 15:53:39.800677   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0213 15:53:39.801580   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 15:53:39.801604   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 15:53:39.801671   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 15:53:39.801711   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 15:53:40.172267   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0213 15:53:40.172282   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0213 15:53:40.276330   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 15:53:40.276350   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 15:53:40.276362   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 15:53:40.276380   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 15:53:40.277255   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0213 15:53:40.277269   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:40 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0213 15:53:45.383748   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:45 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0213 15:53:45.383806   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:45 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0213 15:53:45.383817   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | 2024/02/13 15:53:45 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0213 15:53:53.129932   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: default-k8s-diff-port-603000
	
	I0213 15:53:53.129954   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.130090   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:53.130188   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.130301   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.130395   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:53.130523   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:53.130792   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:53.130805   10919 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sdefault-k8s-diff-port-603000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 default-k8s-diff-port-603000/g' /etc/hosts;
				else 
					echo '127.0.1.1 default-k8s-diff-port-603000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0213 15:53:53.204739   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0213 15:53:53.204759   10919 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18169-2790/.minikube CaCertPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18169-2790/.minikube}
	I0213 15:53:53.204775   10919 buildroot.go:174] setting up certificates
	I0213 15:53:53.204788   10919 provision.go:83] configureAuth start
	I0213 15:53:53.204799   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetMachineName
	I0213 15:53:53.204942   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetIP
	I0213 15:53:53.205035   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.205109   10919 provision.go:138] copyHostCerts
	I0213 15:53:53.205198   10919 exec_runner.go:144] found /Users/jenkins/minikube-integration/18169-2790/.minikube/key.pem, removing ...
	I0213 15:53:53.205209   10919 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18169-2790/.minikube/key.pem
	I0213 15:53:53.205346   10919 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18169-2790/.minikube/key.pem (1679 bytes)
	I0213 15:53:53.205591   10919 exec_runner.go:144] found /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.pem, removing ...
	I0213 15:53:53.205597   10919 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.pem
	I0213 15:53:53.205675   10919 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.pem (1082 bytes)
	I0213 15:53:53.205856   10919 exec_runner.go:144] found /Users/jenkins/minikube-integration/18169-2790/.minikube/cert.pem, removing ...
	I0213 15:53:53.205862   10919 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18169-2790/.minikube/cert.pem
	I0213 15:53:53.205935   10919 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18169-2790/.minikube/cert.pem (1123 bytes)
	I0213 15:53:53.206086   10919 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca-key.pem org=jenkins.default-k8s-diff-port-603000 san=[192.169.0.44 192.169.0.44 localhost 127.0.0.1 minikube default-k8s-diff-port-603000]
	I0213 15:53:53.401997   10919 provision.go:172] copyRemoteCerts
	I0213 15:53:53.402108   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0213 15:53:53.402136   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.402328   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:53.402461   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.402581   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:53.402725   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:53:53.442436   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/server.pem --> /etc/docker/server.pem (1257 bytes)
	I0213 15:53:53.458671   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0213 15:53:53.474647   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0213 15:53:53.490406   10919 provision.go:86] duration metric: configureAuth took 285.601195ms
	I0213 15:53:53.490423   10919 buildroot.go:189] setting minikube options for container-runtime
	I0213 15:53:53.490566   10919 config.go:182] Loaded profile config "default-k8s-diff-port-603000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:53:53.490579   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:53.490724   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.490843   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:53.490955   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.491050   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.491131   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:53.491247   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:53.491487   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:53.491498   10919 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0213 15:53:53.561625   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0213 15:53:53.561643   10919 buildroot.go:70] root file system type: tmpfs
	I0213 15:53:53.561707   10919 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0213 15:53:53.561723   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.561854   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:53.561947   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.562035   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.562115   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:53.562252   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:53.562507   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:53.562559   10919 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0213 15:53:53.638952   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0213 15:53:53.638977   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:53.639129   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:53.639229   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.639323   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:53.639414   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:53.639534   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:53.639789   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:53.639805   10919 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0213 15:53:54.256176   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0213 15:53:54.256192   10919 machine.go:91] provisioned docker machine in 14.312422836s
	I0213 15:53:54.256204   10919 start.go:300] post-start starting for "default-k8s-diff-port-603000" (driver="hyperkit")
	I0213 15:53:54.256211   10919 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0213 15:53:54.256223   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:54.256409   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0213 15:53:54.256421   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:54.256510   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:54.256596   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:54.256686   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:54.256771   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:53:54.296038   10919 ssh_runner.go:195] Run: cat /etc/os-release
	I0213 15:53:54.298916   10919 info.go:137] Remote host: Buildroot 2021.02.12
	I0213 15:53:54.298931   10919 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18169-2790/.minikube/addons for local assets ...
	I0213 15:53:54.299025   10919 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18169-2790/.minikube/files for local assets ...
	I0213 15:53:54.299619   10919 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/ssl/certs/33422.pem -> 33422.pem in /etc/ssl/certs
	I0213 15:53:54.299828   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0213 15:53:54.305755   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/ssl/certs/33422.pem --> /etc/ssl/certs/33422.pem (1708 bytes)
	I0213 15:53:54.322584   10919 start.go:303] post-start completed in 66.370514ms
	I0213 15:53:54.322596   10919 fix.go:56] fixHost completed within 14.485492524s
	I0213 15:53:54.322613   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:54.322749   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:54.322846   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:54.322923   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:54.323044   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:54.323167   10919 main.go:141] libmachine: Using SSH client type: native
	I0213 15:53:54.323414   10919 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I0213 15:53:54.323422   10919 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0213 15:53:54.392058   10919 main.go:141] libmachine: SSH cmd err, output: <nil>: 1707868434.040583241
	
	I0213 15:53:54.392074   10919 fix.go:206] guest clock: 1707868434.040583241
	I0213 15:53:54.392079   10919 fix.go:219] Guest: 2024-02-13 15:53:54.040583241 -0800 PST Remote: 2024-02-13 15:53:54.322599 -0800 PST m=+15.178802027 (delta=-282.015759ms)
	I0213 15:53:54.392097   10919 fix.go:190] guest clock delta is within tolerance: -282.015759ms
	I0213 15:53:54.392105   10919 start.go:83] releasing machines lock for "default-k8s-diff-port-603000", held for 14.555032528s
	I0213 15:53:54.392123   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:54.392252   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetIP
	I0213 15:53:54.392355   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:54.392633   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:54.392722   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:53:54.392913   10919 ssh_runner.go:195] Run: cat /version.json
	I0213 15:53:54.392925   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:54.393010   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:54.393100   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:54.393183   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:54.393267   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:53:54.393298   10919 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0213 15:53:54.393327   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:53:54.393405   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:53:54.393491   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:53:54.393568   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:53:54.393653   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:53:54.429400   10919 ssh_runner.go:195] Run: systemctl --version
	I0213 15:53:54.433656   10919 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0213 15:53:54.478518   10919 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0213 15:53:54.478596   10919 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0213 15:53:54.489284   10919 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0213 15:53:54.489301   10919 start.go:475] detecting cgroup driver to use...
	I0213 15:53:54.489412   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0213 15:53:54.501085   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0213 15:53:54.507772   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0213 15:53:54.514300   10919 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0213 15:53:54.514354   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0213 15:53:54.520792   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0213 15:53:54.527695   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0213 15:53:54.534329   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0213 15:53:54.540993   10919 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0213 15:53:54.547829   10919 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0213 15:53:54.554403   10919 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0213 15:53:54.560497   10919 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0213 15:53:54.566471   10919 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0213 15:53:54.647368   10919 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0213 15:53:54.660182   10919 start.go:475] detecting cgroup driver to use...
	I0213 15:53:54.660255   10919 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0213 15:53:54.670561   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0213 15:53:54.681745   10919 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0213 15:53:54.696011   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0213 15:53:54.704921   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0213 15:53:54.713909   10919 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0213 15:53:54.735393   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0213 15:53:54.744506   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0213 15:53:54.757363   10919 ssh_runner.go:195] Run: which cri-dockerd
	I0213 15:53:54.759672   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0213 15:53:54.765360   10919 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0213 15:53:54.776919   10919 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0213 15:53:54.862744   10919 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0213 15:53:54.950800   10919 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0213 15:53:54.950870   10919 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0213 15:53:54.962250   10919 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0213 15:53:55.048693   10919 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0213 15:53:56.361309   10919 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.31258068s)
	I0213 15:53:56.361374   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0213 15:53:56.370115   10919 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0213 15:53:56.380123   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0213 15:53:56.389025   10919 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0213 15:53:56.473016   10919 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0213 15:53:56.562333   10919 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0213 15:53:56.647853   10919 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0213 15:53:56.658711   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0213 15:53:56.667781   10919 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0213 15:53:56.753140   10919 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0213 15:53:56.807267   10919 start.go:522] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0213 15:53:56.807343   10919 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0213 15:53:56.811238   10919 start.go:543] Will wait 60s for crictl version
	I0213 15:53:56.811287   10919 ssh_runner.go:195] Run: which crictl
	I0213 15:53:56.813700   10919 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0213 15:53:56.848539   10919 start.go:559] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  24.0.7
	RuntimeApiVersion:  v1
	I0213 15:53:56.848619   10919 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0213 15:53:56.865144   10919 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0213 15:53:56.925620   10919 out.go:204] * Preparing Kubernetes v1.28.4 on Docker 24.0.7 ...
	I0213 15:53:56.925646   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetIP
	I0213 15:53:56.925846   10919 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0213 15:53:56.928316   10919 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
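The one-liner above is minikube's idempotent hosts-entry update: strip any stale `host.minikube.internal` line, append the current mapping, and replace the file via a temp copy. The same pattern, run against a scratch file instead of the real `/etc/hosts`:

```shell
# Idempotent hosts-entry update pattern, on a scratch file.
# A stale 10.0.0.1 entry (illustrative) is dropped, then the
# current mapping is appended and the file replaced.
hosts="$(mktemp)"
printf '10.0.0.1\thost.minikube.internal\n127.0.0.1\tlocalhost\n' > "$hosts"
{ grep -v $'\thost.minikube.internal$' "$hosts"; printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
grep 'host.minikube.internal' "$hosts"
```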
	I0213 15:53:56.936061   10919 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I0213 15:53:56.936126   10919 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0213 15:53:56.949192   10919 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.28.4
	registry.k8s.io/kube-proxy:v1.28.4
	registry.k8s.io/kube-controller-manager:v1.28.4
	registry.k8s.io/kube-scheduler:v1.28.4
	registry.k8s.io/etcd:3.5.9-0
	registry.k8s.io/coredns/coredns:v1.10.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28.4-glibc
	
	-- /stdout --
	I0213 15:53:56.949213   10919 docker.go:615] Images already preloaded, skipping extraction
	I0213 15:53:56.949281   10919 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0213 15:53:56.962613   10919 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.28.4
	registry.k8s.io/kube-proxy:v1.28.4
	registry.k8s.io/kube-controller-manager:v1.28.4
	registry.k8s.io/kube-scheduler:v1.28.4
	registry.k8s.io/etcd:3.5.9-0
	registry.k8s.io/coredns/coredns:v1.10.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28.4-glibc
	
	-- /stdout --
	I0213 15:53:56.962637   10919 cache_images.go:84] Images are preloaded, skipping loading
	I0213 15:53:56.962709   10919 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0213 15:53:56.981130   10919 cni.go:84] Creating CNI manager for ""
	I0213 15:53:56.981144   10919 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 15:53:56.981157   10919 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0213 15:53:56.981171   10919 kubeadm.go:176] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.44 APIServerPort:8444 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:default-k8s-diff-port-603000 NodeName:default-k8s-diff-port-603000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.44"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.44 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0213 15:53:56.981263   10919 kubeadm.go:181] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.44
	  bindPort: 8444
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "default-k8s-diff-port-603000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.44
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.44"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8444
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
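The kubeadm config above is rendered as a four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) separated by `---`. The real stream is consumed by kubeadm itself; a minimal stand-in shows how such a stream splits into its documents:

```python
# Stand-in for the four-document kubeadm YAML stream in the log above;
# kubeadm does the real parsing, this only illustrates the "---" separator.
stream = (
    "kind: InitConfiguration\n---\n"
    "kind: ClusterConfiguration\n---\n"
    "kind: KubeletConfiguration\n---\n"
    "kind: KubeProxyConfiguration\n"
)
docs = [d.strip() for d in stream.split("\n---\n") if d.strip()]
print(len(docs))  # 4
```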
	
	I0213 15:53:56.981327   10919 kubeadm.go:976] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime-endpoint=unix:///var/run/cri-dockerd.sock --hostname-override=default-k8s-diff-port-603000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.44
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-603000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:}
	I0213 15:53:56.981386   10919 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0213 15:53:56.987861   10919 binaries.go:44] Found k8s binaries, skipping transfer
	I0213 15:53:56.987909   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0213 15:53:56.994107   10919 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (389 bytes)
	I0213 15:53:57.005074   10919 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0213 15:53:57.015904   10919 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2111 bytes)
	I0213 15:53:57.027200   10919 ssh_runner.go:195] Run: grep 192.169.0.44	control-plane.minikube.internal$ /etc/hosts
	I0213 15:53:57.029423   10919 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.44	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0213 15:53:57.037278   10919 certs.go:56] Setting up /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000 for IP: 192.169.0.44
	I0213 15:53:57.037296   10919 certs.go:190] acquiring lock for shared ca certs: {Name:mkbda05235901fe7fd4e84a9c5103764710e2c54 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 15:53:57.037475   10919 certs.go:199] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.key
	I0213 15:53:57.037544   10919 certs.go:199] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/18169-2790/.minikube/proxy-client-ca.key
	I0213 15:53:57.037657   10919 certs.go:315] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.key
	I0213 15:53:57.037736   10919 certs.go:315] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/apiserver.key.46a96eba
	I0213 15:53:57.037813   10919 certs.go:315] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/proxy-client.key
	I0213 15:53:57.038074   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/3342.pem (1338 bytes)
	W0213 15:53:57.038126   10919 certs.go:433] ignoring /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/3342_empty.pem, impossibly tiny 0 bytes
	I0213 15:53:57.038135   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca-key.pem (1679 bytes)
	I0213 15:53:57.038177   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem (1082 bytes)
	I0213 15:53:57.038206   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/cert.pem (1123 bytes)
	I0213 15:53:57.038238   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/certs/key.pem (1679 bytes)
	I0213 15:53:57.038303   10919 certs.go:437] found cert: /Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/ssl/certs/33422.pem (1708 bytes)
	I0213 15:53:57.038838   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0213 15:53:57.055057   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0213 15:53:57.070955   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0213 15:53:57.086776   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0213 15:53:57.102413   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0213 15:53:57.118106   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0213 15:53:57.133559   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0213 15:53:57.149457   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0213 15:53:57.165386   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/ssl/certs/33422.pem --> /usr/share/ca-certificates/33422.pem (1708 bytes)
	I0213 15:53:57.180909   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0213 15:53:57.196600   10919 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/3342.pem --> /usr/share/ca-certificates/3342.pem (1338 bytes)
	I0213 15:53:57.211898   10919 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0213 15:53:57.223154   10919 ssh_runner.go:195] Run: openssl version
	I0213 15:53:57.226592   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/33422.pem && ln -fs /usr/share/ca-certificates/33422.pem /etc/ssl/certs/33422.pem"
	I0213 15:53:57.233826   10919 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/33422.pem
	I0213 15:53:57.236759   10919 certs.go:480] hashing: -rw-r--r-- 1 root root 1708 Feb 13 22:57 /usr/share/ca-certificates/33422.pem
	I0213 15:53:57.236799   10919 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/33422.pem
	I0213 15:53:57.240202   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/33422.pem /etc/ssl/certs/3ec20f2e.0"
	I0213 15:53:57.247175   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0213 15:53:57.254335   10919 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0213 15:53:57.257324   10919 certs.go:480] hashing: -rw-r--r-- 1 root root 1111 Feb 13 22:50 /usr/share/ca-certificates/minikubeCA.pem
	I0213 15:53:57.257364   10919 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0213 15:53:57.260952   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0213 15:53:57.268141   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3342.pem && ln -fs /usr/share/ca-certificates/3342.pem /etc/ssl/certs/3342.pem"
	I0213 15:53:57.275653   10919 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3342.pem
	I0213 15:53:57.278571   10919 certs.go:480] hashing: -rw-r--r-- 1 root root 1338 Feb 13 22:57 /usr/share/ca-certificates/3342.pem
	I0213 15:53:57.278617   10919 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3342.pem
	I0213 15:53:57.282147   10919 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3342.pem /etc/ssl/certs/51391683.0"
	I0213 15:53:57.289147   10919 ssh_runner.go:195] Run: ls /var/lib/minikube/certs/etcd
	I0213 15:53:57.291825   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0213 15:53:57.295407   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0213 15:53:57.298948   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0213 15:53:57.302451   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0213 15:53:57.305905   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0213 15:53:57.309496   10919 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
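The six `openssl x509 ... -checkend 86400` runs above verify each cert stays valid for at least the next 86400 seconds (one day); `-checkend N` exits 0 only in that case. A throwaway self-signed cert (2-day validity, hypothetical CN) demonstrates the semantics:

```shell
# -checkend N: exit 0 if the cert is still valid N seconds from now.
# Throwaway self-signed cert in a temp dir, valid for 2 days.
d="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$d/key.pem" \
  -out "$d/cert.pem" -days 2 -subj "/CN=checkend-demo" 2>/dev/null
openssl x509 -noout -in "$d/cert.pem" -checkend 86400 && echo "valid for >= 1 day"
```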
	I0213 15:53:57.313095   10919 kubeadm.go:404] StartCluster: {Name:default-k8s-diff-port-603000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-603000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP:192.169.0.44 Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:53:57.313191   10919 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0213 15:53:57.325815   10919 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0213 15:53:57.332190   10919 kubeadm.go:419] found existing configuration files, will attempt cluster restart
	I0213 15:53:57.332204   10919 kubeadm.go:636] restartCluster start
	I0213 15:53:57.332252   10919 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0213 15:53:57.338407   10919 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:57.338972   10919 kubeconfig.go:135] verify returned: extract IP: "default-k8s-diff-port-603000" does not appear in /Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:53:57.339268   10919 kubeconfig.go:146] "default-k8s-diff-port-603000" context is missing from /Users/jenkins/minikube-integration/18169-2790/kubeconfig - will repair!
	I0213 15:53:57.339802   10919 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18169-2790/kubeconfig: {Name:mkf6bdf8196211b20577d90f94d0007015c44956 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 15:53:57.341478   10919 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0213 15:53:57.347588   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:57.347638   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:57.356207   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:57.849138   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:57.849284   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:57.859026   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:58.347944   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:58.348042   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:58.357033   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:58.848065   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:58.848247   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:58.857167   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:59.348553   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:59.348621   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:59.356963   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:53:59.848691   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:53:59.848816   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:53:59.857660   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:00.347967   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:00.348088   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:00.357528   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:00.847798   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:00.847904   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:00.858156   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:01.349046   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:01.349141   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:01.357799   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:01.848407   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:01.848523   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:01.857982   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:02.347851   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:02.347953   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:02.356556   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:02.848314   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:02.848399   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:02.857593   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:03.348073   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:03.348236   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:03.357705   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:03.847776   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:03.847866   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:03.856640   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:04.348125   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:04.348207   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:04.357207   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:04.848413   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:04.848510   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:04.858765   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:05.347784   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:05.347853   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:05.356465   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:05.849535   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:05.849643   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:05.858336   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:06.348297   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:06.348394   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:06.357691   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:06.849840   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:06.849958   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:06.859747   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:07.347897   10919 api_server.go:166] Checking apiserver status ...
	I0213 15:54:07.347995   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0213 15:54:07.356960   10919 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0213 15:54:07.356975   10919 kubeadm.go:611] needs reconfigure: apiserver error: context deadline exceeded
	I0213 15:54:07.356989   10919 kubeadm.go:1135] stopping kube-system containers ...
	I0213 15:54:07.357062   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0213 15:54:07.371229   10919 docker.go:483] Stopping containers: [d6d6bfc7e46f 441ed6acf02b 8fc06faf5daf cf10ab2a5821 7f2607327b51 166517917ebe fd971f8d8ee1 7516f7e0a7b3 33e10bdbe674 69aac7de960e f753753d8cc1 9108c4562e79 cad1f358b39e 68fe27be6ecf d475108f0118]
	I0213 15:54:07.371307   10919 ssh_runner.go:195] Run: docker stop d6d6bfc7e46f 441ed6acf02b 8fc06faf5daf cf10ab2a5821 7f2607327b51 166517917ebe fd971f8d8ee1 7516f7e0a7b3 33e10bdbe674 69aac7de960e f753753d8cc1 9108c4562e79 cad1f358b39e 68fe27be6ecf d475108f0118
	I0213 15:54:07.384947   10919 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0213 15:54:07.396080   10919 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0213 15:54:07.402566   10919 kubeadm.go:152] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0213 15:54:07.402612   10919 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0213 15:54:07.408981   10919 kubeadm.go:713] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I0213 15:54:07.408990   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:07.480653   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:08.092528   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:08.226801   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:08.291048   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:08.353803   10919 api_server.go:52] waiting for apiserver process to appear ...
	I0213 15:54:08.353867   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:54:08.855209   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:54:09.354261   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:54:09.853975   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:54:09.878931   10919 api_server.go:72] duration metric: took 1.525095463s to wait for apiserver process to appear ...
	I0213 15:54:09.878951   10919 api_server.go:88] waiting for apiserver healthz status ...
	I0213 15:54:09.878977   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:54:12.727324   10919 api_server.go:279] https://192.169.0.44:8444/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0213 15:54:12.727343   10919 api_server.go:103] status: https://192.169.0.44:8444/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0213 15:54:12.727354   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:54:12.756415   10919 api_server.go:279] https://192.169.0.44:8444/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0213 15:54:12.756434   10919 api_server.go:103] status: https://192.169.0.44:8444/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0213 15:54:12.879358   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:54:12.883398   10919 api_server.go:279] https://192.169.0.44:8444/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	W0213 15:54:12.883419   10919 api_server.go:103] status: https://192.169.0.44:8444/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	I0213 15:54:13.380763   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:54:13.387534   10919 api_server.go:279] https://192.169.0.44:8444/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	W0213 15:54:13.387551   10919 api_server.go:103] status: https://192.169.0.44:8444/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	I0213 15:54:13.879253   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:54:13.882740   10919 api_server.go:279] https://192.169.0.44:8444/healthz returned 200:
	ok
	I0213 15:54:13.889935   10919 api_server.go:141] control plane version: v1.28.4
	I0213 15:54:13.889954   10919 api_server.go:131] duration metric: took 4.010913173s to wait for apiserver health ...
	I0213 15:54:13.889963   10919 cni.go:84] Creating CNI manager for ""
	I0213 15:54:13.889973   10919 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 15:54:13.912590   10919 out.go:177] * Configuring bridge CNI (Container Networking Interface) ...
	I0213 15:54:13.948179   10919 ssh_runner.go:195] Run: sudo mkdir -p /etc/cni/net.d
	I0213 15:54:13.962418   10919 ssh_runner.go:362] scp memory --> /etc/cni/net.d/1-k8s.conflist (457 bytes)
	I0213 15:54:14.009894   10919 system_pods.go:43] waiting for kube-system pods to appear ...
	I0213 15:54:14.016091   10919 system_pods.go:59] 8 kube-system pods found
	I0213 15:54:14.016111   10919 system_pods.go:61] "coredns-5dd5756b68-7gs8v" [4dc98c1d-8765-47ea-9752-6280bc1ebde6] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0213 15:54:14.016117   10919 system_pods.go:61] "etcd-default-k8s-diff-port-603000" [9d7d4f7d-a734-42d5-bc35-ee865aeb2554] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0213 15:54:14.016126   10919 system_pods.go:61] "kube-apiserver-default-k8s-diff-port-603000" [c9ed8b20-609f-46fc-be78-c742486752de] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0213 15:54:14.016133   10919 system_pods.go:61] "kube-controller-manager-default-k8s-diff-port-603000" [097bbce7-da9a-4750-b41d-5df3d8d30af2] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0213 15:54:14.016137   10919 system_pods.go:61] "kube-proxy-jc5nj" [e59cad79-d49b-456a-8ad6-d7915ffab536] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0213 15:54:14.016146   10919 system_pods.go:61] "kube-scheduler-default-k8s-diff-port-603000" [14ffc876-351f-4cff-9baa-f256677af78d] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0213 15:54:14.016155   10919 system_pods.go:61] "metrics-server-57f55c9bc5-24wh6" [ada20ec6-8771-4dcf-bf09-c630c1ffac78] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I0213 15:54:14.016160   10919 system_pods.go:61] "storage-provisioner" [689fdbbd-483e-4345-95cb-c566fbbaf8d1] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0213 15:54:14.016165   10919 system_pods.go:74] duration metric: took 6.260768ms to wait for pod list to return data ...
	I0213 15:54:14.016171   10919 node_conditions.go:102] verifying NodePressure condition ...
	I0213 15:54:14.018312   10919 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0213 15:54:14.018329   10919 node_conditions.go:123] node cpu capacity is 2
	I0213 15:54:14.018346   10919 node_conditions.go:105] duration metric: took 2.171633ms to run NodePressure ...
	I0213 15:54:14.018357   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0213 15:54:14.248512   10919 kubeadm.go:772] waiting for restarted kubelet to initialise ...
	I0213 15:54:14.252486   10919 kubeadm.go:787] kubelet initialised
	I0213 15:54:14.252498   10919 kubeadm.go:788] duration metric: took 3.972519ms waiting for restarted kubelet to initialise ...
	I0213 15:54:14.252509   10919 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0213 15:54:14.256939   10919 pod_ready.go:78] waiting up to 4m0s for pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:14.261215   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.261233   10919 pod_ready.go:81] duration metric: took 4.281981ms waiting for pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:14.261240   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.261248   10919 pod_ready.go:78] waiting up to 4m0s for pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:14.265678   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.265690   10919 pod_ready.go:81] duration metric: took 4.435853ms waiting for pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:14.265702   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.265707   10919 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:14.269552   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.269567   10919 pod_ready.go:81] duration metric: took 3.854228ms waiting for pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:14.269575   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.269580   10919 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:14.413611   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.413626   10919 pod_ready.go:81] duration metric: took 144.036938ms waiting for pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:14.413635   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.413641   10919 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-jc5nj" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:14.813347   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "kube-proxy-jc5nj" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.813361   10919 pod_ready.go:81] duration metric: took 399.706684ms waiting for pod "kube-proxy-jc5nj" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:14.813368   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "kube-proxy-jc5nj" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:14.813374   10919 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:15.212766   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:15.212779   10919 pod_ready.go:81] duration metric: took 399.389844ms waiting for pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:15.212789   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:15.212794   10919 pod_ready.go:78] waiting up to 4m0s for pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:15.612545   10919 pod_ready.go:97] node "default-k8s-diff-port-603000" hosting pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:15.612559   10919 pod_ready.go:81] duration metric: took 399.749728ms waiting for pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace to be "Ready" ...
	E0213 15:54:15.612567   10919 pod_ready.go:66] WaitExtra: waitPodCondition: node "default-k8s-diff-port-603000" hosting pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace is currently not "Ready" (skipping!): node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:15.612573   10919 pod_ready.go:38] duration metric: took 1.360026027s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0213 15:54:15.612589   10919 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0213 15:54:15.621621   10919 ops.go:34] apiserver oom_adj: -16
	I0213 15:54:15.621635   10919 kubeadm.go:640] restartCluster took 18.289078206s
	I0213 15:54:15.621640   10919 kubeadm.go:406] StartCluster complete in 18.308205476s
	I0213 15:54:15.621651   10919 settings.go:142] acquiring lock: {Name:mk2b7626a62b7e77e2709adebde10f119ed0f449 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 15:54:15.621731   10919 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:54:15.622669   10919 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18169-2790/kubeconfig: {Name:mkf6bdf8196211b20577d90f94d0007015c44956 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 15:54:15.622943   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0213 15:54:15.622971   10919 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:true default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false]
	I0213 15:54:15.623011   10919 addons.go:69] Setting storage-provisioner=true in profile "default-k8s-diff-port-603000"
	I0213 15:54:15.623018   10919 addons.go:69] Setting default-storageclass=true in profile "default-k8s-diff-port-603000"
	I0213 15:54:15.623024   10919 addons.go:234] Setting addon storage-provisioner=true in "default-k8s-diff-port-603000"
	W0213 15:54:15.623030   10919 addons.go:243] addon storage-provisioner should already be in state true
	I0213 15:54:15.623030   10919 addons.go:69] Setting metrics-server=true in profile "default-k8s-diff-port-603000"
	I0213 15:54:15.623047   10919 addons.go:234] Setting addon metrics-server=true in "default-k8s-diff-port-603000"
	W0213 15:54:15.623065   10919 addons.go:243] addon metrics-server should already be in state true
	I0213 15:54:15.623065   10919 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "default-k8s-diff-port-603000"
	I0213 15:54:15.623074   10919 host.go:66] Checking if "default-k8s-diff-port-603000" exists ...
	I0213 15:54:15.623088   10919 config.go:182] Loaded profile config "default-k8s-diff-port-603000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:54:15.623101   10919 host.go:66] Checking if "default-k8s-diff-port-603000" exists ...
	I0213 15:54:15.623099   10919 addons.go:69] Setting dashboard=true in profile "default-k8s-diff-port-603000"
	I0213 15:54:15.623117   10919 addons.go:234] Setting addon dashboard=true in "default-k8s-diff-port-603000"
	W0213 15:54:15.623124   10919 addons.go:243] addon dashboard should already be in state true
	I0213 15:54:15.623174   10919 host.go:66] Checking if "default-k8s-diff-port-603000" exists ...
	I0213 15:54:15.623957   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.623995   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.624032   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.624091   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.624111   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.624338   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.625460   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.625656   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.634832   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57445
	I0213 15:54:15.635252   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.635646   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.635660   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.635938   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.636161   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:54:15.636232   10919 kapi.go:248] "coredns" deployment in "kube-system" namespace and "default-k8s-diff-port-603000" context rescaled to 1 replicas
	I0213 15:54:15.636255   10919 start.go:223] Will wait 6m0s for node &{Name: IP:192.169.0.44 Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0213 15:54:15.658032   10919 out.go:177] * Verifying Kubernetes components...
	I0213 15:54:15.636312   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:54:15.637178   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57447
	I0213 15:54:15.699039   10919 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0213 15:54:15.638611   10919 addons.go:234] Setting addon default-storageclass=true in "default-k8s-diff-port-603000"
	W0213 15:54:15.699075   10919 addons.go:243] addon default-storageclass should already be in state true
	I0213 15:54:15.639646   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57448
	I0213 15:54:15.699107   10919 host.go:66] Checking if "default-k8s-diff-port-603000" exists ...
	I0213 15:54:15.640503   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57449
	I0213 15:54:15.658067   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:54:15.699595   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.699698   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.699747   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.699759   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.699795   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.700914   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.701100   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.701492   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.701503   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.701520   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.701524   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.701723   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.701867   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.701939   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.702790   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.703348   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.703515   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.703531   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.703555   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.703611   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.712515   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57453
	I0213 15:54:15.713151   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.713659   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.713674   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.714006   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.714607   10919 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:54:15.714635   10919 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:54:15.715520   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57455
	I0213 15:54:15.717630   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.716788   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57457
	I0213 15:54:15.718041   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57458
	I0213 15:54:15.718255   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.718275   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.718448   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.718489   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.718545   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.718682   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:54:15.718864   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.718883   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.718863   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.718896   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:54:15.718904   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.718907   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:54:15.719141   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.719168   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.719344   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:54:15.719375   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:54:15.719477   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:54:15.719494   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:54:15.719571   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:54:15.719592   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:54:15.720114   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:54:15.741194   10919 out.go:177]   - Using image fake.domain/registry.k8s.io/echoserver:1.4
	I0213 15:54:15.720598   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:54:15.720660   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:54:15.724118   10919 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57461
	I0213 15:54:15.762239   10919 addons.go:426] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I0213 15:54:15.762251   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I0213 15:54:15.782982   10919 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0213 15:54:15.762269   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:54:15.762678   10919 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:54:15.771869   10919 node_ready.go:35] waiting up to 6m0s for node "default-k8s-diff-port-603000" to be "Ready" ...
	I0213 15:54:15.772056   10919 start.go:902] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0213 15:54:15.804421   10919 addons.go:426] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0213 15:54:15.804536   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:54:15.825164   10919 out.go:177]   - Using image docker.io/kubernetesui/dashboard:v2.7.0
	I0213 15:54:15.825187   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0213 15:54:15.825312   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:54:15.825686   10919 main.go:141] libmachine: Using API Version  1
	I0213 15:54:15.846043   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:54:15.867181   10919 out.go:177]   - Using image registry.k8s.io/echoserver:1.4
	I0213 15:54:15.867200   10919 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:54:15.888251   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-ns.yaml
	I0213 15:54:15.888263   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-ns.yaml (759 bytes)
	I0213 15:54:15.867339   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:54:15.888276   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:54:15.867350   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:54:15.888477   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:54:15.888521   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:54:15.888543   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:54:15.888653   10919 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:54:15.888696   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:54:15.888736   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:54:15.888820   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetState
	I0213 15:54:15.888882   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:54:15.888888   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:54:15.888954   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:54:15.889054   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:54:15.889067   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | hyperkit pid from json: 10930
	I0213 15:54:15.890389   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .DriverName
	I0213 15:54:15.890554   10919 addons.go:426] installing /etc/kubernetes/addons/storageclass.yaml
	I0213 15:54:15.890563   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0213 15:54:15.890571   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHHostname
	I0213 15:54:15.890672   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHPort
	I0213 15:54:15.890783   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHKeyPath
	I0213 15:54:15.890873   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .GetSSHUsername
	I0213 15:54:15.890958   10919 sshutil.go:53] new ssh client: &{IP:192.169.0.44 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/default-k8s-diff-port-603000/id_rsa Username:docker}
	I0213 15:54:15.940615   10919 addons.go:426] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I0213 15:54:15.940627   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1825 bytes)
	I0213 15:54:15.952776   10919 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0213 15:54:15.957539   10919 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0213 15:54:15.963827   10919 addons.go:426] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I0213 15:54:15.963840   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I0213 15:54:15.969589   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-clusterrole.yaml
	I0213 15:54:15.969599   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrole.yaml (1001 bytes)
	I0213 15:54:15.983913   10919 addons.go:426] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I0213 15:54:15.983926   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I0213 15:54:16.014328   10919 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I0213 15:54:16.024221   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml
	I0213 15:54:16.024241   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml (1018 bytes)
	I0213 15:54:16.062319   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-configmap.yaml
	I0213 15:54:16.062333   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-configmap.yaml (837 bytes)
	I0213 15:54:16.102936   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-dp.yaml
	I0213 15:54:16.102949   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-dp.yaml (4201 bytes)
	I0213 15:54:16.170198   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-role.yaml
	I0213 15:54:16.170214   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-role.yaml (1724 bytes)
	I0213 15:54:16.206233   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-rolebinding.yaml
	I0213 15:54:16.206250   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-rolebinding.yaml (1046 bytes)
	I0213 15:54:16.219189   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-sa.yaml
	I0213 15:54:16.219203   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-sa.yaml (837 bytes)
	I0213 15:54:16.231753   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-secret.yaml
	I0213 15:54:16.231765   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-secret.yaml (1389 bytes)
	I0213 15:54:16.244274   10919 addons.go:426] installing /etc/kubernetes/addons/dashboard-svc.yaml
	I0213 15:54:16.244287   10919 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-svc.yaml (1294 bytes)
	I0213 15:54:16.257541   10919 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml
	I0213 15:54:17.083242   10919 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.130420758s)
	I0213 15:54:17.083267   10919 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (1.125688135s)
	I0213 15:54:17.083283   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.083289   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.083293   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.083296   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.083449   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.083451   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.083465   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.083472   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.083474   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.083480   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.083485   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.083491   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.083474   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.083508   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.083602   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.083614   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.083637   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.083663   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.083675   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.087984   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.087996   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.088131   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.088140   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.088154   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.188254   10919 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (1.173878914s)
	I0213 15:54:17.188290   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.188301   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.188460   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.188478   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.188482   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.188485   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.188491   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.188616   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.188626   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.188627   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.188631   10919 addons.go:470] Verifying addon metrics-server=true in "default-k8s-diff-port-603000"
	I0213 15:54:17.464988   10919 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml: (1.2073895s)
	I0213 15:54:17.465014   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.465023   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.465164   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.465174   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.465186   10919 main.go:141] libmachine: Making call to close driver server
	I0213 15:54:17.465189   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.465197   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) Calling .Close
	I0213 15:54:17.465320   10919 main.go:141] libmachine: Successfully made call to close driver server
	I0213 15:54:17.465324   10919 main.go:141] libmachine: (default-k8s-diff-port-603000) DBG | Closing plugin on server side
	I0213 15:54:17.465332   10919 main.go:141] libmachine: Making call to close connection to plugin binary
	I0213 15:54:17.488011   10919 out.go:177] * Some dashboard features require the metrics-server addon. To enable all features please run:
	
		minikube -p default-k8s-diff-port-603000 addons enable metrics-server
	
	I0213 15:54:17.507903   10919 out.go:177] * Enabled addons: storage-provisioner, default-storageclass, metrics-server, dashboard
	I0213 15:54:17.528765   10919 addons.go:505] enable addons completed in 1.905760644s: enabled=[storage-provisioner default-storageclass metrics-server dashboard]
	I0213 15:54:17.828649   10919 node_ready.go:58] node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:20.329717   10919 node_ready.go:58] node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:22.827847   10919 node_ready.go:58] node "default-k8s-diff-port-603000" has status "Ready":"False"
	I0213 15:54:23.329034   10919 node_ready.go:49] node "default-k8s-diff-port-603000" has status "Ready":"True"
	I0213 15:54:23.329050   10919 node_ready.go:38] duration metric: took 7.50372839s waiting for node "default-k8s-diff-port-603000" to be "Ready" ...
	I0213 15:54:23.329056   10919 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0213 15:54:23.333040   10919 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.336358   10919 pod_ready.go:92] pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:23.336370   10919 pod_ready.go:81] duration metric: took 3.319226ms waiting for pod "coredns-5dd5756b68-7gs8v" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.336376   10919 pod_ready.go:78] waiting up to 6m0s for pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.339676   10919 pod_ready.go:92] pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:23.339685   10919 pod_ready.go:81] duration metric: took 3.304162ms waiting for pod "etcd-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.339693   10919 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.843871   10919 pod_ready.go:92] pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:23.843884   10919 pod_ready.go:81] duration metric: took 504.174443ms waiting for pod "kube-apiserver-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:23.843891   10919 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.349110   10919 pod_ready.go:92] pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:24.349121   10919 pod_ready.go:81] duration metric: took 505.21284ms waiting for pod "kube-controller-manager-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.349128   10919 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-jc5nj" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.529150   10919 pod_ready.go:92] pod "kube-proxy-jc5nj" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:24.529161   10919 pod_ready.go:81] duration metric: took 180.024734ms waiting for pod "kube-proxy-jc5nj" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.529168   10919 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.930189   10919 pod_ready.go:92] pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace has status "Ready":"True"
	I0213 15:54:24.930202   10919 pod_ready.go:81] duration metric: took 401.019668ms waiting for pod "kube-scheduler-default-k8s-diff-port-603000" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:24.930210   10919 pod_ready.go:78] waiting up to 6m0s for pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace to be "Ready" ...
	I0213 15:54:26.934373   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:28.937088   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:31.434441   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:33.434828   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:35.435928   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:37.934233   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:39.936850   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:41.942239   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:44.435037   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:46.435335   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:48.436335   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:50.937052   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:53.434345   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:55.435291   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:57.435408   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:54:59.437129   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:01.437537   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:03.936448   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:05.936846   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:08.435085   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:10.436600   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:12.936058   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:14.937171   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:16.937379   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:19.438525   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:21.936955   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:23.937692   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:26.437707   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:28.935735   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:30.938252   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:33.437927   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:35.938283   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:38.437034   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:40.938568   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:42.938781   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:45.436520   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:47.936962   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:49.938211   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:52.437955   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:54.936982   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:56.937610   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:55:59.437019   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:01.438858   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:03.939085   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:05.939174   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:08.438390   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:10.438596   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:12.938288   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:15.437699   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:17.438655   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:19.939529   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:22.438448   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:24.938078   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:26.938424   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:29.438825   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:31.938848   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:34.438829   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:36.940308   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:38.940570   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:41.438413   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:43.439358   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:45.939566   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:48.438035   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:50.438868   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:52.937991   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:54.938071   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:56.940029   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:56:59.439639   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:01.439740   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:03.939308   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:06.440152   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:08.939567   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:10.940126   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:13.438241   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:15.440273   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:17.940474   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:20.439696   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:22.440752   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:24.939272   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:27.440465   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:29.940259   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:31.940555   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:34.440001   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:36.440431   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:38.939597   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:40.940254   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:42.941037   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:45.441545   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:47.943329   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:50.440837   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:52.940784   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:55.439840   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:57.440032   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:57:59.440965   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:01.941402   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:04.440297   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:06.940053   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:08.941389   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:11.440895   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:13.442387   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:15.941513   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:18.440183   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:20.442291   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:22.941441   10919 pod_ready.go:102] pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace has status "Ready":"False"
	I0213 15:58:24.941359   10919 pod_ready.go:81] duration metric: took 4m0.004975787s waiting for pod "metrics-server-57f55c9bc5-24wh6" in "kube-system" namespace to be "Ready" ...
	E0213 15:58:24.941373   10919 pod_ready.go:66] WaitExtra: waitPodCondition: context deadline exceeded
	I0213 15:58:24.941378   10919 pod_ready.go:38] duration metric: took 4m1.606113814s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0213 15:58:24.941391   10919 api_server.go:52] waiting for apiserver process to appear ...
	I0213 15:58:24.941476   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:58:24.955751   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:58:24.955830   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:58:24.970231   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:58:24.970309   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:58:24.984160   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:58:24.984241   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:58:24.998206   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:58:24.998281   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:58:25.011700   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:58:25.011778   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:58:25.026061   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:58:25.026137   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:58:25.040315   10919 logs.go:276] 0 containers: []
	W0213 15:58:25.040327   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:58:25.040392   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:58:25.054674   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:58:25.054757   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:58:25.068946   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:58:25.068967   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:58:25.068975   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:58:25.105517   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:58:25.105539   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:58:25.126748   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:58:25.126764   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:58:25.145818   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:58:25.145833   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:58:25.162222   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:58:25.162237   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:58:25.189824   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:58:25.189842   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:58:25.215933   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:58:25.215951   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:58:25.306084   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:58:25.306101   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:58:25.322374   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:58:25.322389   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:58:25.339176   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:58:25.339191   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:58:25.355043   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:58:25.355062   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:58:25.395135   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:58:25.395156   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:58:25.419539   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:58:25.419557   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:58:25.435322   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:58:25.435341   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:58:25.461447   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:58:25.461462   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:58:25.478674   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:58:25.478689   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:58:25.499537   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:58:25.499552   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:58:25.553665   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:58:25.553681   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:58:25.599893   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:58:25.599913   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:58:25.610663   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:58:25.610678   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:58:28.134030   10919 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:58:28.143688   10919 api_server.go:72] duration metric: took 4m12.50094823s to wait for apiserver process to appear ...
	I0213 15:58:28.143699   10919 api_server.go:88] waiting for apiserver healthz status ...
	I0213 15:58:28.143772   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:58:28.157962   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:58:28.158038   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:58:28.170483   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:58:28.170557   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:58:28.183578   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:58:28.183651   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:58:28.196627   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:58:28.196702   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:58:28.209245   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:58:28.209328   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:58:28.223121   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:58:28.223195   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:58:28.235823   10919 logs.go:276] 0 containers: []
	W0213 15:58:28.235836   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:58:28.235902   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:58:28.249156   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:58:28.249228   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:58:28.262218   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:58:28.262237   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:58:28.262243   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:58:28.296760   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:58:28.296774   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:58:28.311479   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:58:28.311494   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:58:28.327525   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:58:28.327539   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:58:28.353671   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:58:28.353693   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:58:28.369809   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:58:28.369824   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:58:28.414836   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:58:28.414852   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:58:28.430836   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:58:28.430850   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:58:28.478461   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:58:28.478475   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:58:28.494126   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:58:28.494144   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:58:28.604223   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:58:28.604239   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:58:28.631249   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:58:28.631265   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:58:28.650821   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:58:28.650838   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:58:28.670437   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:58:28.670452   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:58:28.685985   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:58:28.686001   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:58:28.700560   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:58:28.700574   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:58:28.738398   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:58:28.738414   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:58:28.748710   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:58:28.748723   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:58:28.769266   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:58:28.769279   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:58:28.795715   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:58:28.795729   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:58:31.311325   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:58:36.311918   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:58:36.312054   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:58:36.326126   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:58:36.326202   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:58:36.340319   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:58:36.340395   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:58:36.353665   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:58:36.353744   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:58:36.366256   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:58:36.366340   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:58:36.380162   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:58:36.380235   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:58:36.393512   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:58:36.393590   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:58:36.406502   10919 logs.go:276] 0 containers: []
	W0213 15:58:36.406517   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:58:36.406583   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:58:36.420128   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:58:36.420206   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:58:36.434034   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:58:36.434052   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:58:36.434059   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:58:36.472160   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:58:36.472179   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:58:36.488346   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:58:36.488363   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:58:36.513910   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:58:36.513926   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:58:36.530596   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:58:36.530610   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:58:36.551985   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:58:36.551998   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:58:36.567504   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:58:36.567518   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:58:36.582808   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:58:36.582825   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:58:36.607200   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:58:36.607214   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:58:36.637664   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:58:36.637678   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:58:36.653824   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:58:36.653838   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:58:36.669497   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:58:36.669511   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:58:36.683494   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:58:36.683509   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:58:36.764702   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:58:36.764717   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:58:36.789760   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:58:36.789775   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:58:36.810124   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:58:36.810137   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:58:36.824950   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:58:36.824967   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:58:36.863409   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:58:36.863424   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:58:36.909551   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:58:36.909566   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:58:36.953701   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:58:36.953717   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:58:39.469147   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:58:44.470805   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:58:44.470918   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:58:44.484829   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:58:44.484909   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:58:44.498572   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:58:44.498647   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:58:44.511935   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:58:44.512014   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:58:44.525419   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:58:44.525494   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:58:44.539485   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:58:44.539562   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:58:44.553302   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:58:44.553376   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:58:44.566445   10919 logs.go:276] 0 containers: []
	W0213 15:58:44.566458   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:58:44.566519   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:58:44.579632   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:58:44.579705   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:58:44.592578   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:58:44.592595   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:58:44.592603   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:58:44.614429   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:58:44.614443   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:58:44.629171   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:58:44.629186   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:58:44.643591   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:58:44.643605   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:58:44.662780   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:58:44.662793   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:58:44.677555   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:58:44.677570   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:58:44.703854   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:58:44.703868   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:58:44.721104   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:58:44.721118   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:58:44.778025   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:58:44.778040   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:58:44.822663   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:58:44.822678   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:58:44.907422   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:58:44.907438   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:58:44.928336   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:58:44.928351   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:58:44.947877   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:58:44.947890   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:58:44.963677   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:58:44.963691   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:58:44.972589   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:58:44.972601   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:58:45.007401   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:58:45.007416   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:58:45.022960   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:58:45.022975   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:58:45.038517   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:58:45.038532   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:58:45.063302   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:58:45.063315   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:58:45.078260   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:58:45.078275   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:58:47.616583   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:58:52.617986   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:58:52.618188   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:58:52.632946   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:58:52.633028   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:58:52.646638   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:58:52.646716   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:58:52.659674   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:58:52.659750   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:58:52.672697   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:58:52.672768   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:58:52.686194   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:58:52.686292   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:58:52.700472   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:58:52.700547   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:58:52.713302   10919 logs.go:276] 0 containers: []
	W0213 15:58:52.713314   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:58:52.713373   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:58:52.727095   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:58:52.727173   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:58:52.742643   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:58:52.742660   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:58:52.742666   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:58:52.777302   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:58:52.777319   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:58:52.792386   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:58:52.792400   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:58:52.809892   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:58:52.809906   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:58:52.824877   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:58:52.824892   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:58:52.863902   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:58:52.863919   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:58:52.883450   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:58:52.883464   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:58:52.899266   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:58:52.899280   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:58:52.919463   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:58:52.919477   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:58:52.928478   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:58:52.928490   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:58:53.008640   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:58:53.008655   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:58:53.027477   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:58:53.027492   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:58:53.043267   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:58:53.043283   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:58:53.058954   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:58:53.058968   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:58:53.083822   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:58:53.083837   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:58:53.132917   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:58:53.132932   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:58:53.180247   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:58:53.180265   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:58:53.201998   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:58:53.202011   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:58:53.216567   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:58:53.216580   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:58:53.251793   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:58:53.251810   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:58:55.773514   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:00.774893   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:00.775082   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:00.789747   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:00.789821   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:00.803604   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:00.803677   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:00.817037   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:00.817112   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:00.830762   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:00.830837   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:00.844945   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:00.845020   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:00.868423   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:00.868502   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:00.881896   10919 logs.go:276] 0 containers: []
	W0213 15:59:00.881910   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:00.881971   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:00.895790   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:00.895866   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:00.912210   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:00.912228   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:00.912235   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:00.952018   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:00.952035   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:00.986394   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:00.986409   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:01.000547   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:01.000562   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:01.014604   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:01.014619   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:01.030007   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:01.030021   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:01.045648   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:01.045663   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:01.061252   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:01.061266   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:01.070132   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:01.070144   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:01.165950   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:01.165965   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:01.189824   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:01.189837   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:01.236382   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:01.236397   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:01.256182   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:01.256198   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:01.271570   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:01.271583   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:01.302769   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:01.302782   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:01.327932   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:01.327946   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:01.344217   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:01.344231   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:01.397525   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:01.397543   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:01.421046   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:01.421060   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:01.441682   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:01.441696   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:03.959343   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:08.960695   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:08.960915   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:08.977278   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:08.977352   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:08.990556   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:08.990629   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:09.004400   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:09.004475   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:09.018079   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:09.018153   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:09.032172   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:09.032252   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:09.045875   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:09.045946   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:09.062239   10919 logs.go:276] 0 containers: []
	W0213 15:59:09.062252   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:09.062320   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:09.075527   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:09.075605   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:09.089532   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:09.089550   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:09.089560   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:09.113724   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:09.113737   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:09.141679   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:09.141692   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:09.156088   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:09.156102   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:09.240416   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:09.240430   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:09.261440   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:09.261454   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:09.294410   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:09.294425   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:09.315250   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:09.315264   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:09.330639   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:09.330652   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:09.347260   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:09.347274   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:09.385978   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:09.385993   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:09.395338   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:09.395351   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:09.410639   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:09.410653   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:09.430825   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:09.430839   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:09.451297   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:09.451310   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:09.466343   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:09.466357   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:09.481218   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:09.481232   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:09.530434   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:09.530448   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:09.579432   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:09.579448   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:09.594401   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:09.594415   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:12.120513   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:17.121483   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:17.121632   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:17.137144   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:17.137219   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:17.152021   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:17.152103   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:17.166098   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:17.166181   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:17.179857   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:17.179935   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:17.193467   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:17.193552   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:17.206927   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:17.207000   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:17.220182   10919 logs.go:276] 0 containers: []
	W0213 15:59:17.220199   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:17.220269   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:17.233560   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:17.233650   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:17.247755   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:17.247772   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:17.247780   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:17.264129   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:17.264143   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:17.279330   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:17.279344   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:17.366117   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:17.366132   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:17.390668   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:17.390682   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:17.409796   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:17.409811   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:17.425104   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:17.425118   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:17.445112   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:17.445127   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:17.461454   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:17.461469   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:17.487113   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:17.487127   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:17.542596   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:17.542614   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:17.564814   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:17.564827   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:17.592379   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:17.592393   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:17.606789   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:17.606803   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:17.622755   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:17.622771   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:17.631637   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:17.631649   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:17.665166   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:17.665180   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:17.682809   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:17.682823   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:17.701872   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:17.701886   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:17.739442   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:17.739459   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:20.291259   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:25.292441   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:25.292614   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:25.307677   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:25.307751   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:25.320612   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:25.320688   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:25.334187   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:25.334261   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:25.351822   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:25.351901   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:25.365474   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:25.365551   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:25.379300   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:25.379380   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:25.393447   10919 logs.go:276] 0 containers: []
	W0213 15:59:25.393460   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:25.393522   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:25.407596   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:25.407669   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:25.421164   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:25.421184   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:25.421196   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:25.440505   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:25.440519   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:25.461812   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:25.461825   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:25.477201   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:25.477215   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:25.498466   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:25.498482   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:25.523699   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:25.523713   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:25.571808   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:25.571822   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:25.586771   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:25.586786   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:25.602949   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:25.602962   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:25.617591   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:25.617605   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:25.664638   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:25.664653   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:25.691945   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:25.691959   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:25.701086   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:25.701097   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:25.786038   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:25.786053   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:25.809714   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:25.809729   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:25.843727   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:25.843744   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:25.859682   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:25.859696   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:25.876060   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:25.876073   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:25.892578   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:25.892593   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:25.909618   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:25.909632   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:28.448708   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:33.449449   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:33.449662   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:33.464758   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:33.464832   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:33.477602   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:33.477681   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:33.490971   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:33.491050   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:33.504512   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:33.504585   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:33.517341   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:33.517418   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:33.532977   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:33.533053   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:33.546453   10919 logs.go:276] 0 containers: []
	W0213 15:59:33.546467   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:33.546530   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:33.559697   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:33.559775   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:33.575898   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:33.575916   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:33.575923   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:33.622128   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:33.622142   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:33.664103   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:33.664118   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:33.679088   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:33.679102   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:33.701479   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:33.701494   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:33.729572   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:33.729586   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:33.759269   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:33.759286   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:33.773869   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:33.773882   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:33.788257   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:33.788271   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:33.808152   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:33.808165   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:33.824757   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:33.824771   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:33.863852   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:33.863868   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:33.873539   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:33.873551   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:33.954511   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:33.954526   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:33.979251   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:33.979265   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:34.034034   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:34.034048   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:34.053456   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:34.053470   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:34.068914   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:34.068928   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:34.088965   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:34.088978   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:34.108486   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:34.108503   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:36.624047   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:41.624465   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:41.624739   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:41.639576   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:41.639652   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:41.653332   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:41.653408   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:41.666462   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:41.666538   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:41.679683   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:41.679765   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:41.694835   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:41.694912   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:41.708619   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:41.708699   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:41.721413   10919 logs.go:276] 0 containers: []
	W0213 15:59:41.721427   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:41.721491   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:41.734194   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:41.734266   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:41.751013   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:41.751032   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:41.751039   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:41.769118   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:41.769137   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:41.788721   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:41.788736   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:41.803708   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:41.803723   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:41.818957   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:41.818973   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:41.835913   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:41.835928   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:41.867520   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:41.867534   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:41.892210   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:41.892224   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:41.927167   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:41.927182   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:41.964858   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:41.964873   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:41.974308   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:41.974320   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:41.989254   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:41.989269   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:42.004073   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:42.004088   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:42.018622   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:42.018637   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:42.033669   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:42.033683   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:42.086357   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:42.086372   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:42.138828   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:42.138847   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:42.277127   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:42.277142   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:42.300124   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:42.300139   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:42.315570   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:42.315585   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:44.835830   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:49.836694   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:49.836877   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:49.855913   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:49.855994   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:49.869161   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:49.869239   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:49.882361   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:49.882437   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:49.895780   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:49.895856   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:49.909340   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:49.909419   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:49.922658   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:49.922732   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:49.935216   10919 logs.go:276] 0 containers: []
	W0213 15:59:49.935229   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:49.935294   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:49.948440   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:49.948515   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:49.961433   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:49.961469   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:49.961477   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:49.975966   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:49.975981   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:49.992003   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:49.992017   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:50.013718   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:50.013731   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:50.052271   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:50.052289   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:50.131063   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:50.131077   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:50.151187   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:50.151200   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 15:59:50.171771   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:50.171785   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:50.186696   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:50.186711   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:50.240751   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:50.240765   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:50.289458   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:50.289472   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:50.311569   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:50.311583   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:50.348251   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:50.348265   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:50.364392   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:50.364406   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:50.380265   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:50.380279   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:50.395547   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:50.395563   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:50.421830   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:50.421845   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:50.448828   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:50.448842   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:50.457957   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:50.457968   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:50.471926   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:50.471941   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:52.991666   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 15:59:57.992310   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 15:59:57.992409   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 15:59:58.005371   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 15:59:58.005442   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 15:59:58.018712   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 15:59:58.018786   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 15:59:58.032644   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 15:59:58.032718   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 15:59:58.049213   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 15:59:58.049290   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 15:59:58.062852   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 15:59:58.062931   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 15:59:58.076707   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 15:59:58.076784   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 15:59:58.089760   10919 logs.go:276] 0 containers: []
	W0213 15:59:58.089773   10919 logs.go:278] No container was found matching "kindnet"
	I0213 15:59:58.089836   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 15:59:58.102591   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 15:59:58.102665   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 15:59:58.115869   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 15:59:58.115888   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 15:59:58.115894   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 15:59:58.130637   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 15:59:58.130651   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 15:59:58.157833   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 15:59:58.157848   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 15:59:58.175159   10919 logs.go:123] Gathering logs for Docker ...
	I0213 15:59:58.175172   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 15:59:58.215113   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 15:59:58.215127   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 15:59:58.238295   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 15:59:58.238310   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 15:59:58.258695   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 15:59:58.258711   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 15:59:58.296331   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 15:59:58.296345   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 15:59:58.311062   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 15:59:58.311075   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 15:59:58.325980   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 15:59:58.325995   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 15:59:58.342720   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 15:59:58.342735   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 15:59:58.359299   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 15:59:58.359312   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 15:59:58.385858   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 15:59:58.385870   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 15:59:58.433447   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 15:59:58.433463   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 15:59:58.443537   10919 logs.go:123] Gathering logs for container status ...
	I0213 15:59:58.443551   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 15:59:58.498490   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 15:59:58.498503   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 15:59:58.512792   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 15:59:58.512806   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 15:59:58.526963   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 15:59:58.526976   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 15:59:58.546801   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 15:59:58.546814   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 15:59:58.633701   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 15:59:58.633716   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 16:00:01.157306   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 16:00:06.158921   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 16:00:06.159044   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 16:00:06.173328   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 16:00:06.173405   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 16:00:06.186401   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 16:00:06.186471   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 16:00:06.198890   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 16:00:06.198968   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 16:00:06.211942   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 16:00:06.212029   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 16:00:06.225464   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 16:00:06.225544   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 16:00:06.243552   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 16:00:06.243626   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 16:00:06.257195   10919 logs.go:276] 0 containers: []
	W0213 16:00:06.257208   10919 logs.go:278] No container was found matching "kindnet"
	I0213 16:00:06.257273   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 16:00:06.270368   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 16:00:06.270440   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 16:00:06.283964   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 16:00:06.283981   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 16:00:06.283988   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 16:00:06.304118   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 16:00:06.304133   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 16:00:06.319033   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 16:00:06.319047   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 16:00:06.338845   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 16:00:06.338858   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 16:00:06.347987   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 16:00:06.347999   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 16:00:06.429064   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 16:00:06.429079   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 16:00:06.463639   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 16:00:06.463652   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 16:00:06.479527   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 16:00:06.479542   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 16:00:06.511584   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 16:00:06.511598   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 16:00:06.527654   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 16:00:06.527669   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 16:00:06.543505   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 16:00:06.543519   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 16:00:06.558298   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 16:00:06.558312   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 16:00:06.573733   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 16:00:06.573747   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 16:00:06.602373   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 16:00:06.602387   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 16:00:06.622084   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 16:00:06.622102   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 16:00:06.671201   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 16:00:06.671217   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 16:00:06.694199   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 16:00:06.694213   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 16:00:06.713350   10919 logs.go:123] Gathering logs for container status ...
	I0213 16:00:06.713365   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 16:00:06.766151   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 16:00:06.766165   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 16:00:06.784659   10919 logs.go:123] Gathering logs for Docker ...
	I0213 16:00:06.784674   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 16:00:09.324344   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 16:00:14.325187   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 16:00:14.325327   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I0213 16:00:14.340650   10919 logs.go:276] 2 containers: [67b5fd979937 69aac7de960e]
	I0213 16:00:14.340725   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I0213 16:00:14.354479   10919 logs.go:276] 2 containers: [8e7930e1af27 33e10bdbe674]
	I0213 16:00:14.354551   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I0213 16:00:14.367494   10919 logs.go:276] 2 containers: [dde67a3d6462 cf10ab2a5821]
	I0213 16:00:14.367569   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I0213 16:00:14.381515   10919 logs.go:276] 2 containers: [4a32e0095c50 7516f7e0a7b3]
	I0213 16:00:14.381590   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I0213 16:00:14.399300   10919 logs.go:276] 2 containers: [3a2c33446232 166517917ebe]
	I0213 16:00:14.399378   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I0213 16:00:14.412334   10919 logs.go:276] 2 containers: [b2bc8bfc1796 f753753d8cc1]
	I0213 16:00:14.412410   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I0213 16:00:14.425303   10919 logs.go:276] 0 containers: []
	W0213 16:00:14.425316   10919 logs.go:278] No container was found matching "kindnet"
	I0213 16:00:14.425376   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I0213 16:00:14.439051   10919 logs.go:276] 2 containers: [35631bb46ae0 d2fb13268178]
	I0213 16:00:14.439125   10919 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I0213 16:00:14.452879   10919 logs.go:276] 1 containers: [9ad73b9e5c82]
	I0213 16:00:14.452898   10919 logs.go:123] Gathering logs for kube-apiserver [69aac7de960e] ...
	I0213 16:00:14.452905   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 69aac7de960e"
	I0213 16:00:14.486017   10919 logs.go:123] Gathering logs for etcd [8e7930e1af27] ...
	I0213 16:00:14.486032   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8e7930e1af27"
	I0213 16:00:14.508992   10919 logs.go:123] Gathering logs for kube-proxy [166517917ebe] ...
	I0213 16:00:14.509007   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 166517917ebe"
	I0213 16:00:14.524204   10919 logs.go:123] Gathering logs for kubernetes-dashboard [9ad73b9e5c82] ...
	I0213 16:00:14.524218   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 9ad73b9e5c82"
	I0213 16:00:14.540398   10919 logs.go:123] Gathering logs for container status ...
	I0213 16:00:14.540414   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0213 16:00:14.586321   10919 logs.go:123] Gathering logs for kubelet ...
	I0213 16:00:14.586338   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0213 16:00:14.635342   10919 logs.go:123] Gathering logs for dmesg ...
	I0213 16:00:14.635359   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0213 16:00:14.645134   10919 logs.go:123] Gathering logs for describe nodes ...
	I0213 16:00:14.645145   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0213 16:00:14.727827   10919 logs.go:123] Gathering logs for kube-scheduler [7516f7e0a7b3] ...
	I0213 16:00:14.727841   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7516f7e0a7b3"
	I0213 16:00:14.754543   10919 logs.go:123] Gathering logs for kube-controller-manager [b2bc8bfc1796] ...
	I0213 16:00:14.754557   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 b2bc8bfc1796"
	I0213 16:00:14.781912   10919 logs.go:123] Gathering logs for kube-controller-manager [f753753d8cc1] ...
	I0213 16:00:14.781926   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f753753d8cc1"
	I0213 16:00:14.806969   10919 logs.go:123] Gathering logs for Docker ...
	I0213 16:00:14.806982   10919 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I0213 16:00:14.844711   10919 logs.go:123] Gathering logs for kube-apiserver [67b5fd979937] ...
	I0213 16:00:14.844726   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 67b5fd979937"
	I0213 16:00:14.867276   10919 logs.go:123] Gathering logs for coredns [dde67a3d6462] ...
	I0213 16:00:14.867290   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dde67a3d6462"
	I0213 16:00:14.882034   10919 logs.go:123] Gathering logs for coredns [cf10ab2a5821] ...
	I0213 16:00:14.882051   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 cf10ab2a5821"
	I0213 16:00:14.896809   10919 logs.go:123] Gathering logs for kube-proxy [3a2c33446232] ...
	I0213 16:00:14.896828   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3a2c33446232"
	I0213 16:00:14.912066   10919 logs.go:123] Gathering logs for etcd [33e10bdbe674] ...
	I0213 16:00:14.912081   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 33e10bdbe674"
	I0213 16:00:14.935538   10919 logs.go:123] Gathering logs for kube-scheduler [4a32e0095c50] ...
	I0213 16:00:14.935551   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 4a32e0095c50"
	I0213 16:00:14.951046   10919 logs.go:123] Gathering logs for storage-provisioner [35631bb46ae0] ...
	I0213 16:00:14.951059   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 35631bb46ae0"
	I0213 16:00:14.965671   10919 logs.go:123] Gathering logs for storage-provisioner [d2fb13268178] ...
	I0213 16:00:14.965685   10919 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d2fb13268178"
	I0213 16:00:17.481145   10919 api_server.go:253] Checking apiserver healthz at https://192.169.0.44:8444/healthz ...
	I0213 16:00:22.481742   10919 api_server.go:269] stopped: https://192.169.0.44:8444/healthz: Get "https://192.169.0.44:8444/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0213 16:00:22.503750   10919 out.go:177] 
	W0213 16:00:22.524657   10919 out.go:239] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: wait for healthy API server: apiserver healthz never reported healthy: cluster wait timed out during healthz check
	W0213 16:00:22.524675   10919 out.go:239] * 
	W0213 16:00:22.525784   10919 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0213 16:00:22.603612   10919 out.go:177] 

                                                
                                                
** /stderr **
start_stop_delete_test.go:259: failed to start minikube post-stop. args "out/minikube-darwin-amd64 start -p default-k8s-diff-port-603000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4": exit status 80
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
E0213 16:00:24.488409    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:00:25.212602    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:00:26.510815    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 16:00:46.811257    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:01:07.850914    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 3 (1m15.090431593s)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0213 16:01:37.817996   11103 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out
	E0213 16:01:37.818019   11103 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-603000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (478.67s)

                                                
                                    
TestStartStop/group/newest-cni/serial/FirstStart (802.03s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E0213 15:58:39.248836    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:58:39.975823    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:58:48.190480    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:58:48.783175    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:58:50.869778    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:58:53.834990    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:59:01.432820    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:59:04.012907    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:59:05.728870    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:59:06.942038    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 15:59:21.520250    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 15:59:35.959241    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 16:00:00.036463    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 16:00:13.917586    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: exit status 52 (12m6.913502357s)

                                                
                                                
-- stdout --
	* [newest-cni-173000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node newest-cni-173000 in cluster newest-cni-173000
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	* Deleting "newest-cni-173000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0213 15:58:17.741502   11066 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:58:17.741672   11066 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:58:17.741678   11066 out.go:304] Setting ErrFile to fd 2...
	I0213 15:58:17.741683   11066 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:58:17.741861   11066 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:58:17.743296   11066 out.go:298] Setting JSON to false
	I0213 15:58:17.765928   11066 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4871,"bootTime":1707863826,"procs":473,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 15:58:17.766042   11066 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 15:58:17.786210   11066 out.go:177] * [newest-cni-173000] minikube v1.32.0 on Darwin 14.3.1
	I0213 15:58:17.809141   11066 out.go:177]   - MINIKUBE_LOCATION=18169
	I0213 15:58:17.809211   11066 notify.go:220] Checking for updates...
	I0213 15:58:17.851170   11066 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:58:17.872017   11066 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 15:58:17.914129   11066 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 15:58:17.935087   11066 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 15:58:17.956272   11066 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0213 15:58:17.977643   11066 config.go:182] Loaded profile config "default-k8s-diff-port-603000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:58:17.977745   11066 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 15:58:18.005963   11066 out.go:177] * Using the hyperkit driver based on user configuration
	I0213 15:58:18.048220   11066 start.go:298] selected driver: hyperkit
	I0213 15:58:18.048234   11066 start.go:902] validating driver "hyperkit" against <nil>
	I0213 15:58:18.048245   11066 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0213 15:58:18.051183   11066 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 15:58:18.051294   11066 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 15:58:18.059032   11066 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 15:58:18.062771   11066 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:58:18.062794   11066 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 15:58:18.062826   11066 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	W0213 15:58:18.062865   11066 out.go:239] ! With --network-plugin=cni, you will need to provide your own CNI. See --cni flag as a user-friendly alternative
	I0213 15:58:18.063498   11066 start_flags.go:946] Waiting for components: map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true]
	I0213 15:58:18.063568   11066 cni.go:84] Creating CNI manager for ""
	I0213 15:58:18.063581   11066 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 15:58:18.063591   11066 start_flags.go:316] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0213 15:58:18.063600   11066 start_flags.go:321] config:
	{Name:newest-cni-173000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:newest-cni-173000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates:ServerSideApply=true ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:kubeadm Key:pod-network-cidr Value:10.42.0.0/16}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:58:18.063751   11066 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 15:58:18.126182   11066 out.go:177] * Starting control plane node newest-cni-173000 in cluster newest-cni-173000
	I0213 15:58:18.147040   11066 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I0213 15:58:18.147073   11066 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I0213 15:58:18.147090   11066 cache.go:56] Caching tarball of preloaded images
	I0213 15:58:18.147738   11066 preload.go:174] Found /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0213 15:58:18.147749   11066 cache.go:59] Finished verifying existence of preloaded tar for  v1.29.0-rc.2 on docker
	I0213 15:58:18.147834   11066 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/newest-cni-173000/config.json ...
	I0213 15:58:18.147853   11066 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/newest-cni-173000/config.json: {Name:mkfb096d2525127c1037400e6d21127521242071 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 15:58:18.148665   11066 start.go:365] acquiring machines lock for newest-cni-173000: {Name:mke947868f35224fa4aab1d5f0a66de1e12a8270 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0213 15:58:18.148729   11066 start.go:369] acquired machines lock for "newest-cni-173000" in 50.148µs
	I0213 15:58:18.148756   11066 start.go:93] Provisioning new machine with config: &{Name:newest-cni-173000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:newest-cni-173000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates:ServerSideApply=true ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:kubeadm Key:pod-network-cidr Value:10.42.0.0/16}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:} &{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0213 15:58:18.148830   11066 start.go:125] createHost starting for "" (driver="hyperkit")
	I0213 15:58:18.191334   11066 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0213 15:58:18.191507   11066 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:58:18.191557   11066 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:58:18.200163   11066 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57537
	I0213 15:58:18.200529   11066 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:58:18.201013   11066 main.go:141] libmachine: Using API Version  1
	I0213 15:58:18.201026   11066 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:58:18.201221   11066 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:58:18.201337   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetMachineName
	I0213 15:58:18.201424   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 15:58:18.201535   11066 start.go:159] libmachine.API.Create for "newest-cni-173000" (driver="hyperkit")
	I0213 15:58:18.201556   11066 client.go:168] LocalClient.Create starting
	I0213 15:58:18.201595   11066 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem
	I0213 15:58:18.201642   11066 main.go:141] libmachine: Decoding PEM data...
	I0213 15:58:18.201658   11066 main.go:141] libmachine: Parsing certificate...
	I0213 15:58:18.201722   11066 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/cert.pem
	I0213 15:58:18.201758   11066 main.go:141] libmachine: Decoding PEM data...
	I0213 15:58:18.201772   11066 main.go:141] libmachine: Parsing certificate...
	I0213 15:58:18.201789   11066 main.go:141] libmachine: Running pre-create checks...
	I0213 15:58:18.201798   11066 main.go:141] libmachine: (newest-cni-173000) Calling .PreCreateCheck
	I0213 15:58:18.201881   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:18.202079   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetConfigRaw
	I0213 15:58:18.202473   11066 main.go:141] libmachine: Creating machine...
	I0213 15:58:18.202482   11066 main.go:141] libmachine: (newest-cni-173000) Calling .Create
	I0213 15:58:18.202552   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:18.202680   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 15:58:18.202546   11074 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 15:58:18.202724   11066 main.go:141] libmachine: (newest-cni-173000) Downloading /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18169-2790/.minikube/cache/iso/amd64/minikube-v1.32.1-1703784139-17866-amd64.iso...
	I0213 15:58:18.379293   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 15:58:18.379229   11074 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/id_rsa...
	I0213 15:58:18.523194   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 15:58:18.523128   11074 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk...
	I0213 15:58:18.523213   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Writing magic tar header
	I0213 15:58:18.523248   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Writing SSH key tar header
	I0213 15:58:18.523584   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 15:58:18.523542   11074 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000 ...
	I0213 15:58:18.855863   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:18.855884   11066 main.go:141] libmachine: (newest-cni-173000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid
	I0213 15:58:18.855914   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Using UUID f27b1faf-b740-488e-b2c7-e7f4b5a579e3
	I0213 15:58:18.905857   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Generated MAC 9e:48:95:9f:7c:44
	I0213 15:58:18.905879   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000
	I0213 15:58:18.905915   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f27b1faf-b740-488e-b2c7-e7f4b5a579e3", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0000ea660)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 15:58:18.905944   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f27b1faf-b740-488e-b2c7-e7f4b5a579e3", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0000ea660)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 15:58:18.906002   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "f27b1faf-b740-488e-b2c7-e7f4b5a579e3", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"}
	I0213 15:58:18.906044   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U f27b1faf-b740-488e-b2c7-e7f4b5a579e3 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring -f kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"
	I0213 15:58:18.906060   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0213 15:58:18.908869   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 DEBUG: hyperkit: Pid is 11075
	I0213 15:58:18.909350   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 0
	I0213 15:58:18.909364   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:18.909459   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:18.910426   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:18.910526   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:58:18.910540   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 15:58:18.910564   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:58:18.910583   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 15:58:18.910599   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 15:58:18.910615   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 15:58:18.910630   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 15:58:18.910644   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 15:58:18.910655   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 15:58:18.910663   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 15:58:18.910684   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 15:58:18.910701   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 15:58:18.910725   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 15:58:18.910739   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 15:58:18.910752   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 15:58:18.910763   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 15:58:18.910776   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 15:58:18.910788   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 15:58:18.910807   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 15:58:18.910823   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 15:58:18.910841   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 15:58:18.910857   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 15:58:18.910870   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 15:58:18.910921   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 15:58:18.910944   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 15:58:18.910962   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 15:58:18.910979   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 15:58:18.910997   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 15:58:18.911020   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 15:58:18.911034   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 15:58:18.911053   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 15:58:18.911069   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 15:58:18.911087   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 15:58:18.911104   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 15:58:18.911121   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 15:58:18.911136   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 15:58:18.911163   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 15:58:18.911184   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 15:58:18.911199   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 15:58:18.911214   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 15:58:18.911228   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 15:58:18.911241   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 15:58:18.911265   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 15:58:18.911282   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 15:58:18.916296   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0213 15:58:18.925160   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0213 15:58:18.925965   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 15:58:18.925996   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 15:58:18.926016   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 15:58:18.926033   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 15:58:19.297847   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0213 15:58:19.297864   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0213 15:58:19.401816   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 15:58:19.401835   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 15:58:19.401847   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 15:58:19.401863   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 15:58:19.402744   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0213 15:58:19.402767   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:19 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0213 15:58:20.912208   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 1
	I0213 15:58:20.912225   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:20.912322   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:20.913221   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:20.913310   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:58:20.913326   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 15:58:20.913347   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:58:20.913373   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 15:58:20.913386   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 15:58:20.913400   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 15:58:20.913414   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 15:58:20.913426   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 15:58:20.913436   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 15:58:20.913448   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 15:58:20.913459   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 15:58:20.913477   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 15:58:20.913492   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 15:58:20.913505   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 15:58:20.913514   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 15:58:20.913521   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 15:58:20.913537   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 15:58:20.913544   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 15:58:20.913551   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 15:58:20.913560   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 15:58:20.913574   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 15:58:20.913595   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 15:58:20.913607   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 15:58:20.913617   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 15:58:20.913624   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 15:58:20.913631   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 15:58:20.913639   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 15:58:20.913648   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 15:58:20.913656   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 15:58:20.913663   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 15:58:20.913671   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 15:58:20.913679   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 15:58:20.913695   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 15:58:20.913703   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 15:58:20.913713   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 15:58:20.913721   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 15:58:20.913730   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 15:58:20.913739   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 15:58:20.913749   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 15:58:20.913760   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 15:58:20.913769   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 15:58:20.913777   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 15:58:20.913785   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 15:58:20.913794   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 15:58:22.915313   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 2
	I0213 15:58:22.915330   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:22.915417   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:22.916259   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:22.916341   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:58:22.916354   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 15:58:22.916369   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:58:22.916377   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 15:58:22.916385   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 15:58:22.916398   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 15:58:22.916411   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 15:58:22.916421   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 15:58:22.916434   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 15:58:22.916442   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 15:58:22.916449   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 15:58:22.916462   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 15:58:22.916469   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 15:58:22.916484   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 15:58:22.916491   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 15:58:22.916499   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 15:58:22.916509   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 15:58:22.916519   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 15:58:22.916527   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 15:58:22.916548   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 15:58:22.916559   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 15:58:22.916570   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 15:58:22.916579   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 15:58:22.916586   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 15:58:22.916595   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 15:58:22.916603   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 15:58:22.916611   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 15:58:22.916619   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 15:58:22.916627   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 15:58:22.916637   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 15:58:22.916646   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 15:58:22.916654   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 15:58:22.916662   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 15:58:22.916670   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 15:58:22.916678   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 15:58:22.916695   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 15:58:22.916706   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 15:58:22.916715   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 15:58:22.916724   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 15:58:22.916732   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 15:58:22.916740   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 15:58:22.916751   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 15:58:22.916760   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 15:58:22.916770   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 15:58:24.367632   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:24 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0213 15:58:24.367689   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:24 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0213 15:58:24.367699   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 15:58:24 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0213 15:58:24.916734   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 3
	I0213 15:58:24.916755   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:24.916844   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:24.917786   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:24.917869   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:58:24.917882   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 15:58:24.917896   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:58:24.917907   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 15:58:24.917915   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 15:58:24.917926   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 15:58:24.917938   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 15:58:24.917946   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 15:58:24.917961   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 15:58:24.917968   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 15:58:24.917974   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 15:58:24.917981   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 15:58:24.917991   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 15:58:24.918002   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 15:58:24.918019   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 15:58:24.918034   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 15:58:24.918043   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 15:58:24.918052   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 15:58:24.918059   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 15:58:24.918072   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 15:58:24.918080   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 15:58:24.918088   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 15:58:24.918096   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 15:58:24.918105   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 15:58:24.918120   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 15:58:24.918129   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 15:58:24.918137   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 15:58:24.918145   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 15:58:24.918152   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 15:58:24.918161   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 15:58:24.918170   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 15:58:24.918179   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 15:58:24.918186   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 15:58:24.918194   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 15:58:24.918202   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 15:58:24.918210   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 15:58:24.918217   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 15:58:24.918226   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 15:58:24.918234   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 15:58:24.918242   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 15:58:24.918255   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 15:58:24.918264   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 15:58:24.918278   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 15:58:24.918290   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 15:58:26.918232   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 4
	I0213 15:58:26.918246   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:26.918277   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:26.919148   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:26.919226   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0213 15:58:26.919244   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 15:58:26.919254   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 15:58:26.919263   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 15:58:26.919271   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 15:58:26.919278   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 15:58:26.919286   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 15:58:26.919295   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 15:58:26.919305   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 15:58:26.919313   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 15:58:26.919320   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 15:58:26.919327   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 15:58:26.919336   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 15:58:26.919343   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 15:58:26.919353   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 15:58:26.919376   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 15:58:26.919388   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 15:58:26.919397   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 15:58:26.919405   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 15:58:26.919433   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 15:58:26.919447   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 15:58:26.919456   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 15:58:26.919465   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 15:58:26.919472   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 15:58:26.919482   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 15:58:26.919491   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 15:58:26.919501   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 15:58:26.919514   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 15:58:26.919534   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 15:58:26.919543   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 15:58:26.919551   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 15:58:26.919558   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 15:58:26.919565   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 15:58:26.919573   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 15:58:26.919580   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 15:58:26.919587   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 15:58:26.919593   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 15:58:26.919600   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 15:58:26.919609   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 15:58:26.919616   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 15:58:26.919629   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 15:58:26.919641   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 15:58:26.919649   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 15:58:26.919659   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 15:58:28.920991   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 5
	I0213 15:58:28.921021   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:28.921120   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:28.922674   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for 9e:48:95:9f:7c:44 in /var/db/dhcpd_leases ...
	I0213 15:58:28.922782   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 15:58:28.922802   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cd53a3}
	I0213 15:58:28.922834   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found match: 9e:48:95:9f:7c:44
	I0213 15:58:28.922872   11066 main.go:141] libmachine: (newest-cni-173000) DBG | IP: 192.169.0.45
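	The lease search logged above — scan every `dhcp entry` for the VM's generated MAC address, then report the matching IP — can be sketched as follows. This is a hedged illustration, not the driver's actual code: `findIPForMAC` is a hypothetical helper, and it parses entries in the Go-struct form printed in this log (the on-disk `/var/db/dhcpd_leases` format itself is different).

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// findIPForMAC scans lease text in the "IPAddress:... HWAddress:..." form
// printed in the log above and returns the IP whose hardware address
// matches mac, plus whether a match was found.
func findIPForMAC(leases, mac string) (string, bool) {
	re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+)`)
	for _, line := range strings.Split(leases, "\n") {
		if m := re.FindStringSubmatch(line); m != nil && m[2] == mac {
			return m[1], true
		}
	}
	return "", false
}

func main() {
	// One entry copied from the log; the driver searched for 9e:48:95:9f:7c:44.
	sample := "dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cd53a3}"
	ip, ok := findIPForMAC(sample, "9e:48:95:9f:7c:44")
	fmt.Println(ip, ok) // 192.169.0.45 true
}
```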
	I0213 15:58:28.923013   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetConfigRaw
	I0213 15:58:28.923934   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 15:58:28.924110   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 15:58:28.924252   11066 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0213 15:58:28.924271   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetState
	I0213 15:58:28.924382   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:58:28.924456   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 15:58:28.925582   11066 main.go:141] libmachine: Detecting operating system of created instance...
	I0213 15:58:28.925597   11066 main.go:141] libmachine: Waiting for SSH to be available...
	I0213 15:58:28.925615   11066 main.go:141] libmachine: Getting to WaitForSSH function...
	I0213 15:58:28.925624   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHHostname
	I0213 15:58:28.925748   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHPort
	I0213 15:58:28.925868   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 15:58:28.925973   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 15:58:28.926081   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHUsername
	I0213 15:58:28.926248   11066 main.go:141] libmachine: Using SSH client type: native
	I0213 15:58:28.926549   11066 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.45 22 <nil> <nil>}
	I0213 15:58:28.926560   11066 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0213 15:59:43.928267   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:01:01.931752   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:02:19.937073   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:03:37.940524   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:04:18.210990   11066 start.go:128] duration metric: createHost completed in 6m0.05284288s
	I0213 16:04:18.211017   11066 start.go:83] releasing machines lock for "newest-cni-173000", held for 6m0.052984278s
	W0213 16:04:18.211040   11066 start.go:694] error starting host: creating host: create host timed out in 360.000000 seconds
	I0213 16:04:18.211522   11066 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:04:18.211544   11066 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:04:18.220196   11066 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57573
	I0213 16:04:18.220599   11066 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:04:18.221005   11066 main.go:141] libmachine: Using API Version  1
	I0213 16:04:18.221025   11066 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:04:18.221255   11066 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:04:18.221611   11066 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:04:18.221641   11066 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:04:18.229686   11066 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57575
	I0213 16:04:18.230019   11066 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:04:18.230375   11066 main.go:141] libmachine: Using API Version  1
	I0213 16:04:18.230390   11066 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:04:18.230606   11066 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:04:18.230705   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetState
	I0213 16:04:18.230777   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:18.230856   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 16:04:18.231802   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:04:18.253134   11066 out.go:177] * Deleting "newest-cni-173000" in hyperkit ...
	I0213 16:04:18.312334   11066 main.go:141] libmachine: (newest-cni-173000) Calling .Remove
	I0213 16:04:18.312601   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:18.312643   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:18.312673   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 16:04:18.313803   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:18.313842   11066 main.go:141] libmachine: (newest-cni-173000) DBG | waiting for graceful shutdown
	I0213 16:04:18.455615   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:18 INFO : hyperkit: stdout: linkname /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty
	I0213 16:04:18.455643   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:18 INFO : hyperkit: stdout: COM1 connected to /dev/ttys001
	I0213 16:04:18.466246   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:18 WARN : hyperkit: failed to read stdout: EOF
	I0213 16:04:18.466277   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:18 WARN : hyperkit: failed to read stderr: EOF
	I0213 16:04:19.314083   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:19.314220   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11075
	I0213 16:04:19.315291   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid 11075 missing from process table
	W0213 16:04:19.426587   11066 out.go:239] ! StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds
	! StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds
	I0213 16:04:19.426602   11066 start.go:709] Will try again in 5 seconds ...
	I0213 16:04:24.427011   11066 start.go:365] acquiring machines lock for newest-cni-173000: {Name:mke947868f35224fa4aab1d5f0a66de1e12a8270 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0213 16:04:24.427150   11066 start.go:369] acquired machines lock for "newest-cni-173000" in 109.591µs
	I0213 16:04:24.427173   11066 start.go:93] Provisioning new machine with config: &{Name:newest-cni-173000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:newest-cni-173000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates:ServerSideApply=true ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:kubeadm Key:pod-network-cidr Value:10.42.0.0/16}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:} &{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0213 16:04:24.427246   11066 start.go:125] createHost starting for "" (driver="hyperkit")
	I0213 16:04:24.448994   11066 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0213 16:04:24.449112   11066 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:04:24.449151   11066 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:04:24.457909   11066 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57577
	I0213 16:04:24.458268   11066 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:04:24.458627   11066 main.go:141] libmachine: Using API Version  1
	I0213 16:04:24.458638   11066 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:04:24.458879   11066 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:04:24.458989   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetMachineName
	I0213 16:04:24.459094   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:04:24.459213   11066 start.go:159] libmachine.API.Create for "newest-cni-173000" (driver="hyperkit")
	I0213 16:04:24.459236   11066 client.go:168] LocalClient.Create starting
	I0213 16:04:24.459264   11066 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/ca.pem
	I0213 16:04:24.459310   11066 main.go:141] libmachine: Decoding PEM data...
	I0213 16:04:24.459324   11066 main.go:141] libmachine: Parsing certificate...
	I0213 16:04:24.459366   11066 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18169-2790/.minikube/certs/cert.pem
	I0213 16:04:24.459401   11066 main.go:141] libmachine: Decoding PEM data...
	I0213 16:04:24.459415   11066 main.go:141] libmachine: Parsing certificate...
	I0213 16:04:24.459428   11066 main.go:141] libmachine: Running pre-create checks...
	I0213 16:04:24.459434   11066 main.go:141] libmachine: (newest-cni-173000) Calling .PreCreateCheck
	I0213 16:04:24.459515   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:24.459540   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetConfigRaw
	I0213 16:04:24.459979   11066 main.go:141] libmachine: Creating machine...
	I0213 16:04:24.459988   11066 main.go:141] libmachine: (newest-cni-173000) Calling .Create
	I0213 16:04:24.460057   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:24.460186   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 16:04:24.460056   11144 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 16:04:24.460237   11066 main.go:141] libmachine: (newest-cni-173000) Downloading /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18169-2790/.minikube/cache/iso/amd64/minikube-v1.32.1-1703784139-17866-amd64.iso...
	I0213 16:04:24.626872   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 16:04:24.626806   11144 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/id_rsa...
	I0213 16:04:24.873346   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 16:04:24.873253   11144 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk...
	I0213 16:04:24.873364   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Writing magic tar header
	I0213 16:04:24.873378   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Writing SSH key tar header
	I0213 16:04:24.873980   11066 main.go:141] libmachine: (newest-cni-173000) DBG | I0213 16:04:24.873948   11144 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000 ...
	I0213 16:04:25.206330   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:25.206362   11066 main.go:141] libmachine: (newest-cni-173000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid
	I0213 16:04:25.206379   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Using UUID 4b16baee-b9c8-419f-8f65-773ebb2e9bdb
	I0213 16:04:25.244660   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Generated MAC e6:f9:f6:2c:6f:21
	I0213 16:04:25.244678   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000
	I0213 16:04:25.244712   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"4b16baee-b9c8-419f-8f65-773ebb2e9bdb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000110450)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 16:04:25.244747   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"4b16baee-b9c8-419f-8f65-773ebb2e9bdb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000110450)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 16:04:25.244809   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "4b16baee-b9c8-419f-8f65-773ebb2e9bdb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"}
	I0213 16:04:25.244846   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 4b16baee-b9c8-419f-8f65-773ebb2e9bdb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring -f kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"
	I0213 16:04:25.244863   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0213 16:04:25.247589   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 DEBUG: hyperkit: Pid is 11145
	I0213 16:04:25.247988   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 0
	I0213 16:04:25.247999   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:25.248098   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:25.248981   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:25.249114   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 16:04:25.249141   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}
	I0213 16:04:25.249227   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 16:04:25.249247   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 16:04:25.249257   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 16:04:25.249264   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 16:04:25.249299   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 16:04:25.249317   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 16:04:25.249331   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 16:04:25.249340   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 16:04:25.249349   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 16:04:25.249362   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 16:04:25.249371   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 16:04:25.249381   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 16:04:25.249402   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 16:04:25.249424   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 16:04:25.249438   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 16:04:25.249450   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 16:04:25.249467   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 16:04:25.249481   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 16:04:25.249492   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 16:04:25.249501   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 16:04:25.249522   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 16:04:25.249536   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 16:04:25.249544   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 16:04:25.249554   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 16:04:25.249561   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 16:04:25.249568   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 16:04:25.249584   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 16:04:25.249594   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 16:04:25.249613   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 16:04:25.249626   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 16:04:25.249636   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 16:04:25.249647   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 16:04:25.249656   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 16:04:25.249666   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 16:04:25.249685   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 16:04:25.249703   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 16:04:25.249715   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 16:04:25.249729   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 16:04:25.249737   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 16:04:25.249746   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 16:04:25.249755   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 16:04:25.249784   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 16:04:25.249805   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 16:04:25.255007   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0213 16:04:25.263467   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0213 16:04:25.264457   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 16:04:25.264473   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 16:04:25.264482   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 16:04:25.264492   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 16:04:25.633456   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0213 16:04:25.633476   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0213 16:04:25.737567   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 16:04:25.737807   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 16:04:25.737820   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 16:04:25.737826   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 16:04:25.738438   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0213 16:04:25.738446   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:25 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0213 16:04:27.250241   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 1
	I0213 16:04:27.250265   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:27.250348   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:27.251177   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:27.251264   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 16:04:27.251276   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}
	I0213 16:04:27.251299   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 16:04:27.251310   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 16:04:27.251318   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 16:04:27.251325   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 16:04:27.251332   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 16:04:27.251349   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 16:04:27.251366   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 16:04:27.251375   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 16:04:27.251382   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 16:04:27.251392   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 16:04:27.251402   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 16:04:27.251417   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 16:04:27.251429   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 16:04:27.251440   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 16:04:27.251448   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 16:04:27.251456   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 16:04:27.251462   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 16:04:27.251474   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 16:04:27.251483   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 16:04:27.251491   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 16:04:27.251498   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 16:04:27.251511   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 16:04:27.251519   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 16:04:27.251528   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 16:04:27.251538   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 16:04:27.251546   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 16:04:27.251554   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 16:04:27.251562   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 16:04:27.251570   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 16:04:27.251578   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 16:04:27.251587   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 16:04:27.251597   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 16:04:27.251605   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 16:04:27.251618   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 16:04:27.251635   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 16:04:27.251650   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 16:04:27.251660   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 16:04:27.251669   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 16:04:27.251677   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 16:04:27.251685   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 16:04:27.251693   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 16:04:27.251717   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 16:04:27.251725   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 16:04:29.252259   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 2
	I0213 16:04:29.252277   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:29.252331   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:29.253152   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:29.253232   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 16:04:29.253243   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}
	I0213 16:04:29.253253   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 16:04:29.253260   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 16:04:29.253273   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 16:04:29.253283   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 16:04:29.253292   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 16:04:29.253298   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 16:04:29.253313   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 16:04:29.253331   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 16:04:29.253341   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 16:04:29.253351   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 16:04:29.253359   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 16:04:29.253366   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 16:04:29.253384   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 16:04:29.253396   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 16:04:29.253413   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 16:04:29.253429   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 16:04:29.253440   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 16:04:29.253448   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 16:04:29.253456   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 16:04:29.253465   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 16:04:29.253473   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 16:04:29.253481   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 16:04:29.253492   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 16:04:29.253501   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 16:04:29.253509   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 16:04:29.253517   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 16:04:29.253524   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 16:04:29.253533   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 16:04:29.253541   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 16:04:29.253549   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 16:04:29.253557   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 16:04:29.253565   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 16:04:29.253572   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 16:04:29.253586   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 16:04:29.253598   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 16:04:29.253609   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 16:04:29.253617   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 16:04:29.253626   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 16:04:29.253635   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 16:04:29.253644   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 16:04:29.253656   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 16:04:29.253668   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 16:04:29.253685   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 16:04:30.667354   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:30 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0213 16:04:30.667370   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:30 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0213 16:04:30.667381   11066 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:04:30 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0213 16:04:31.253952   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 3
	I0213 16:04:31.253970   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:31.254073   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:31.254879   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:31.254966   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 16:04:31.254988   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}
	I0213 16:04:31.255024   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 16:04:31.255035   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 16:04:31.255043   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 16:04:31.255053   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 16:04:31.255068   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 16:04:31.255082   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 16:04:31.255092   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 16:04:31.255100   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 16:04:31.255126   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 16:04:31.255142   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 16:04:31.255154   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 16:04:31.255166   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 16:04:31.255175   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 16:04:31.255183   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 16:04:31.255193   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 16:04:31.255201   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 16:04:31.255209   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 16:04:31.255222   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 16:04:31.255230   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 16:04:31.255247   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 16:04:31.255258   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 16:04:31.255266   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 16:04:31.255274   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 16:04:31.255282   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 16:04:31.255290   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 16:04:31.255298   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 16:04:31.255306   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 16:04:31.255314   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 16:04:31.255322   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 16:04:31.255333   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 16:04:31.255342   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 16:04:31.255349   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 16:04:31.255358   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 16:04:31.255365   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 16:04:31.255374   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 16:04:31.255385   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 16:04:31.255393   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 16:04:31.255401   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 16:04:31.255408   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 16:04:31.255415   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 16:04:31.255423   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 16:04:31.255431   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 16:04:31.255440   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 16:04:33.255225   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 4
	I0213 16:04:33.255240   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:33.255295   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:33.256142   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:33.256214   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0213 16:04:33.256231   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}
	I0213 16:04:33.256242   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:ba:b2:e8:3b:2f:ed ID:1,ba:b2:e8:3b:2f:ed Lease:0x65cd528c}
	I0213 16:04:33.256252   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:22:34:c6:19:4c:2e ID:1,22:34:c6:19:4c:2e Lease:0x65cd524f}
	I0213 16:04:33.256260   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:ce:91:bb:8e:b0:c5 ID:1,ce:91:bb:8e:b0:c5 Lease:0x65cd504a}
	I0213 16:04:33.256268   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:d2:c4:90:cf:75:d2 ID:1,d2:c4:90:cf:75:d2 Lease:0x65cd5059}
	I0213 16:04:33.256276   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:a2:fe:3f:a3:bd:5d ID:1,a2:fe:3f:a3:bd:5d Lease:0x65cd4f9a}
	I0213 16:04:33.256285   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:ea:4d:7a:fd:a5:26 ID:1,ea:4d:7a:fd:a5:26 Lease:0x65cd4f44}
	I0213 16:04:33.256293   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ce:fc:41:79:7e:81 ID:1,ce:fc:41:79:7e:81 Lease:0x65cd4f35}
	I0213 16:04:33.256300   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:8e:65:f:4b:58:d4 ID:1,8e:65:f:4b:58:d4 Lease:0x65cd4eeb}
	I0213 16:04:33.256307   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:12:37:2b:87:75:f2 ID:1,12:37:2b:87:75:f2 Lease:0x65cd4edf}
	I0213 16:04:33.256315   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:42:5c:64:90:9a:96 ID:1,42:5c:64:90:9a:96 Lease:0x65cd4e83}
	I0213 16:04:33.256329   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:4e:43:dd:6e:d0:a4 ID:1,4e:43:dd:6e:d0:a4 Lease:0x65cd4e67}
	I0213 16:04:33.256339   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:c2:bc:f:d2:56:37 ID:1,c2:bc:f:d2:56:37 Lease:0x65cd4e21}
	I0213 16:04:33.256345   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:82:56:84:93:20:76 ID:1,82:56:84:93:20:76 Lease:0x65cd4e15}
	I0213 16:04:33.256353   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:86:68:12:94:e3:53 ID:1,86:68:12:94:e3:53 Lease:0x65cbfc90}
	I0213 16:04:33.256368   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:5a:9f:f7:d:31:86 ID:1,5a:9f:f7:d:31:86 Lease:0x65cbfc64}
	I0213 16:04:33.256379   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:2:da:94:be:eb:78 ID:1,2:da:94:be:eb:78 Lease:0x65cd4dad}
	I0213 16:04:33.256387   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:52:b1:93:68:d8:c9 ID:1,52:b1:93:68:d8:c9 Lease:0x65cd4d7e}
	I0213 16:04:33.256395   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:5e:f1:37:cb:1:ab ID:1,5e:f1:37:cb:1:ab Lease:0x65cd4d6f}
	I0213 16:04:33.256403   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:d2:58:e8:51:28:f4 ID:1,d2:58:e8:51:28:f4 Lease:0x65cd4cd6}
	I0213 16:04:33.256411   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e2:b0:82:5e:37:7f ID:1,e2:b0:82:5e:37:7f Lease:0x65cd4c85}
	I0213 16:04:33.256419   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:1e:1c:11:76:5:69 ID:1,1e:1c:11:76:5:69 Lease:0x65cd4c55}
	I0213 16:04:33.256427   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:82:d6:3f:b2:ee:db ID:1,82:d6:3f:b2:ee:db Lease:0x65cbfaca}
	I0213 16:04:33.256433   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:36:e0:4a:c7:a:0 ID:1,36:e0:4a:c7:a:0 Lease:0x65cd4c07}
	I0213 16:04:33.256446   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:6:b2:24:3d:c0:93 ID:1,6:b2:24:3d:c0:93 Lease:0x65cd4be8}
	I0213 16:04:33.256467   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:56:5d:32:d9:58:54 ID:1,56:5d:32:d9:58:54 Lease:0x65cd4bce}
	I0213 16:04:33.256475   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:fa:8e:7c:83:4e:d9 ID:1,fa:8e:7c:83:4e:d9 Lease:0x65cd4b56}
	I0213 16:04:33.256482   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:2a:8f:bb:99:b7:16 ID:1,2a:8f:bb:99:b7:16 Lease:0x65cd4ae3}
	I0213 16:04:33.256491   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:1a:a1:86:8f:2b:ed ID:1,1a:a1:86:8f:2b:ed Lease:0x65cd4aaf}
	I0213 16:04:33.256499   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:42:72:7:ca:56:e ID:1,42:72:7:ca:56:e Lease:0x65cbf89a}
	I0213 16:04:33.256508   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:6e:db:44:6f:36:dd ID:1,6e:db:44:6f:36:dd Lease:0x65cbf810}
	I0213 16:04:33.256516   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6a:bc:38:de:80:fc ID:1,6a:bc:38:de:80:fc Lease:0x65cd49df}
	I0213 16:04:33.256525   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:94:35:f5:ee:46 ID:1,52:94:35:f5:ee:46 Lease:0x65cd49ab}
	I0213 16:04:33.256532   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:da:34:1f:c5:ee:5 ID:1,da:34:1f:c5:ee:5 Lease:0x65cbf65c}
	I0213 16:04:33.256543   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:fa:67:c7:f6:f1:e5 ID:1,fa:67:c7:f6:f1:e5 Lease:0x65cbf645}
	I0213 16:04:33.256551   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:26:3b:e2:e2:a5:a7 ID:1,26:3b:e2:e2:a5:a7 Lease:0x65cd477c}
	I0213 16:04:33.256559   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:d2:14:90:81:4:9d ID:1,d2:14:90:81:4:9d Lease:0x65cd4756}
	I0213 16:04:33.256567   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:7a:28:c5:9e:43:85 ID:1,7a:28:c5:9e:43:85 Lease:0x65cd4719}
	I0213 16:04:33.256575   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:4a:cd:b6:27:72:f6 ID:1,4a:cd:b6:27:72:f6 Lease:0x65cd4693}
	I0213 16:04:33.256583   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:66:d4:88:4d:a4:a6 ID:1,66:d4:88:4d:a4:a6 Lease:0x65cd464d}
	I0213 16:04:33.256592   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:22:72:80:65:85:f8 ID:1,22:72:80:65:85:f8 Lease:0x65cd4555}
	I0213 16:04:33.256605   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:4a:db:1c:ff:88:80 ID:1,4a:db:1c:ff:88:80 Lease:0x65cd4526}
	I0213 16:04:33.256614   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:72:e4:3c:27:f8:90 ID:1,72:e4:3c:27:f8:90 Lease:0x65cd43c3}
	I0213 16:04:33.256623   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:a2:7f:c4:fc:f5:19 ID:1,a2:7f:c4:fc:f5:19 Lease:0x65cd40a2}
	I0213 16:04:35.257303   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 5
	I0213 16:04:35.257384   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:35.257511   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:35.258981   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:04:35.259131   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found 45 entries in /var/db/dhcpd_leases!
	I0213 16:04:35.259148   11066 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.46 HWAddress:e6:f9:f6:2c:6f:21 ID:1,e6:f9:f6:2c:6f:21 Lease:0x65cd5511}
	I0213 16:04:35.259167   11066 main.go:141] libmachine: (newest-cni-173000) DBG | Found match: e6:f9:f6:2c:6f:21
	I0213 16:04:35.259177   11066 main.go:141] libmachine: (newest-cni-173000) DBG | IP: 192.169.0.46
	I0213 16:04:35.259227   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetConfigRaw
	I0213 16:04:35.260020   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:04:35.260168   11066 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:04:35.260309   11066 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0213 16:04:35.260322   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetState
	I0213 16:04:35.260470   11066 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:04:35.260521   11066 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:04:35.261538   11066 main.go:141] libmachine: Detecting operating system of created instance...
	I0213 16:04:35.261547   11066 main.go:141] libmachine: Waiting for SSH to be available...
	I0213 16:04:35.261552   11066 main.go:141] libmachine: Getting to WaitForSSH function...
	I0213 16:04:35.261557   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHHostname
	I0213 16:04:35.261649   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHPort
	I0213 16:04:35.261756   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 16:04:35.261848   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 16:04:35.261948   11066 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHUsername
	I0213 16:04:35.262074   11066 main.go:141] libmachine: Using SSH client type: native
	I0213 16:04:35.262358   11066 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.46 22 <nil> <nil>}
	I0213 16:04:35.262366   11066 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0213 16:04:55.942747   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:05:50.265131   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:06:13.947318   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:07:08.269131   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:07:31.949511   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:08:26.270672   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:08:49.952311   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:09:44.273645   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:10:07.954647   11066 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I0213 16:10:24.469118   11066 start.go:128] duration metric: createHost completed in 6m0.032548346s
	I0213 16:10:24.469187   11066 start.go:83] releasing machines lock for "newest-cni-173000", held for 6m0.032707157s
	W0213 16:10:24.469288   11066 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p newest-cni-173000" may fix it: creating host: create host timed out in 360.000000 seconds
	* Failed to start hyperkit VM. Running "minikube delete -p newest-cni-173000" may fix it: creating host: create host timed out in 360.000000 seconds
	I0213 16:10:24.491037   11066 out.go:177] 
	W0213 16:10:24.512729   11066 out.go:239] X Exiting due to DRV_CREATE_TIMEOUT: Failed to start host: creating host: create host timed out in 360.000000 seconds
	X Exiting due to DRV_CREATE_TIMEOUT: Failed to start host: creating host: create host timed out in 360.000000 seconds
	W0213 16:10:24.512781   11066 out.go:239] * Suggestion: Try 'minikube delete', and disable any conflicting VPN or firewall software
	* Suggestion: Try 'minikube delete', and disable any conflicting VPN or firewall software
	W0213 16:10:24.512804   11066 out.go:239] * Related issue: https://github.com/kubernetes/minikube/issues/7072
	* Related issue: https://github.com/kubernetes/minikube/issues/7072
	I0213 16:10:24.555795   11066 out.go:177] 

** /stderr **
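The retry loop in the log above scans `/var/db/dhcpd_leases`-style entries (echoed as `{Name:... IPAddress:... HWAddress:... Lease:...}`) for the new VM's MAC address. As an illustrative sketch of that matching step — this is not the hyperkit driver's actual parsing code, just a minimal reconstruction from the entry format printed in the log:

```go
package main

import (
	"fmt"
	"strings"
)

// leaseIPForMAC scans dhcpd_leases-style entries, as echoed in the log
// (e.g. "{Name:minikube IPAddress:192.169.0.46 HWAddress:e6:f9:f6:2c:6f:21 ...}"),
// and returns the IP bound to the given hardware address, if any.
func leaseIPForMAC(entries []string, mac string) (string, bool) {
	for _, e := range entries {
		var ip, hw string
		for _, field := range strings.Fields(strings.Trim(e, "{}")) {
			if strings.HasPrefix(field, "IPAddress:") {
				ip = strings.TrimPrefix(field, "IPAddress:")
			}
			if strings.HasPrefix(field, "HWAddress:") {
				hw = strings.TrimPrefix(field, "HWAddress:")
			}
		}
		if hw == mac {
			return ip, true
		}
	}
	return "", false
}

func main() {
	// Two entries copied from the log: attempt 5 finds the match on .46.
	entries := []string{
		"{Name:minikube IPAddress:192.169.0.45 HWAddress:9e:48:95:9f:7c:44 ID:1,9e:48:95:9f:7c:44 Lease:0x65cc0382}",
		"{Name:minikube IPAddress:192.169.0.46 HWAddress:e6:f9:f6:2c:6f:21 ID:1,e6:f9:f6:2c:6f:21 Lease:0x65cd5511}",
	}
	ip, found := leaseIPForMAC(entries, "e6:f9:f6:2c:6f:21")
	fmt.Println(ip, found) // 192.169.0.46 true
}
```

Attempts 3 and 4 print 44 entries with no match; attempt 5 finds the 45th entry for `e6:f9:f6:2c:6f:21` and resolves `192.169.0.46`, after which the driver moves on to waiting for SSH.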
start_stop_delete_test.go:188: failed starting minikube -first start-. args "out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2": exit status 52
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000
E0213 16:10:25.227859    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:10:26.527422    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000: exit status 3 (1m15.092817965s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0213 16:11:39.733106   11180 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out
	E0213 16:11:39.733127   11180 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "newest-cni-173000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/newest-cni/serial/FirstStart (802.03s)

TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (690.18s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
E0213 16:01:48.268356    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:02:04.534702    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:02:16.925824    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:02:25.146098    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 16:02:30.925781    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:02:39.009012    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:03:27.585518    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:03:39.255320    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:03:50.876602    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:03:53.842745    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:04:01.440415    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:04:04.021104    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:04:05.736062    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:04:35.966932    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:05:00.044770    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:05:25.222037    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:05:26.518890    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 16:05:27.072925    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:05:46.818093    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:06:07.858511    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:07:04.542827    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:07:16.933239    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:07:25.153351    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:08:39.262928    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:08:50.886277    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:08:53.851415    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:09:01.448060    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:09:04.028888    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:09:05.743395    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:09:35.975233    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:10:00.053879    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 16:10:02.319143    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:10:16.897403    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:274: ***** TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: pod "k8s-app=kubernetes-dashboard" failed to start within 9m0s: context deadline exceeded ****
start_stop_delete_test.go:274: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
E0213 16:10:46.825939    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:11:07.866831    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 16:11:23.105602    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 3 (1m15.089864256s)

-- stdout --
	Nonexistent

-- /stdout --
** stderr ** 
	E0213 16:11:52.925747   11187 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out
	E0213 16:11:52.925771   11187 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out

** /stderr **
start_stop_delete_test.go:274: status error: exit status 3 (may be ok)
start_stop_delete_test.go:274: "default-k8s-diff-port-603000" apiserver is not running, skipping kubectl commands (state="Nonexistent")
start_stop_delete_test.go:275: failed waiting for 'addon dashboard' pod post-stop-start: k8s-app=kubernetes-dashboard within 9m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
E0213 16:12:04.549881    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 16:12:16.942240    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:12:25.161481    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 3 (1m15.086032354s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0213 16:13:08.015027   11201 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out
	E0213 16:13:08.015039   11201 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-603000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (690.18s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (225.28s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-173000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-173000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: exit status 11 (2m30.189624209s)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_ADDON_ENABLE_PAUSED: enabled failed: check paused: list paused: docker: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out
	* 
	╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                           │
	│    * If the above advice does not help, please let us know:                                                               │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                             │
	│                                                                                                                           │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                  │
	│    * Please also attach the following file to the GitHub issue:                                                           │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_addons_2bafae6fa40fec163538f94366e390b0317a8b15_0.log    │
	│                                                                                                                           │
	╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
start_stop_delete_test.go:207: failed to enable an addon post-stop. args "out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-173000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain": exit status 11
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000
E0213 16:14:35.982686    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000: exit status 3 (1m15.090455289s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0213 16:15:25.019577   11222 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out
	E0213 16:15:25.019606   11222 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "newest-cni-173000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (225.28s)

TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (690.18s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
E0213 16:13:29.584206    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:13:39.271498    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:13:49.882421    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:13:50.891910    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:13:53.857068    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:14:01.456726    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:14:04.035562    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:14:05.751308    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:15:00.061221    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:15:20.001976    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:15:46.833625    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:16:07.874833    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:16:53.944340    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:17:04.515131    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:17:04.557897    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:17:16.948772    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:17:25.169194    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:18:28.294648    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:18:39.278369    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:18:50.900971    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:18:53.866008    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:19:01.463704    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:19:04.042647    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:19:05.760667    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:19:10.953837    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 16:19:19.036508    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 16:19:35.990254    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:20:00.067959    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 16:20:07.611524    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:20:25.243302    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:20:26.541470    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:20:46.841312    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:21:07.883346    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.44:8444: i/o timeout
E0213 16:22:04.565138    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 16:22:07.098880    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.44:8444/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:287: ***** TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: pod "k8s-app=kubernetes-dashboard" failed to start within 9m0s: context deadline exceeded ****
start_stop_delete_test.go:287: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
E0213 16:22:16.956515    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:22:25.176507    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 3 (1m15.090251337s)

-- stdout --
	Nonexistent

-- /stdout --
** stderr ** 
	E0213 16:23:23.123743   11302 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out
	E0213 16:23:23.123763   11302 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out

** /stderr **
start_stop_delete_test.go:287: status error: exit status 3 (may be ok)
start_stop_delete_test.go:287: "default-k8s-diff-port-603000" apiserver is not running, skipping kubectl commands (state="Nonexistent")
start_stop_delete_test.go:288: failed waiting for 'addon dashboard' pod post-stop-start: k8s-app=kubernetes-dashboard within 9m0s: context deadline exceeded
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-diff-port-603000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
start_stop_delete_test.go:291: (dbg) Non-zero exit: kubectl --context default-k8s-diff-port-603000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard: context deadline exceeded (1.071µs)
start_stop_delete_test.go:293: failed to get info on kubernetes-dashboard deployments. args "kubectl --context default-k8s-diff-port-603000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:297: addon did not load correct image. Expected to contain " registry.k8s.io/echoserver:1.4". Addon deployment info: 
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
E0213 16:23:39.286200    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:23:50.908329    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:23:53.873064    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:24:01.472555    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:24:04.050237    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:24:05.766722    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 16:24:35.997879    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 3 (1m15.085617074s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0213 16:24:38.212034   11313 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out
	E0213 16:24:38.212048   11313 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.44:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-603000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (690.18s)

TestStartStop/group/newest-cni/serial/SecondStart (846.22s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E0213 16:15:28.217065    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 16:15:28.810104    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: signal: killed (12m51.130406767s)

-- stdout --
	* [newest-cni-173000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting control plane node newest-cni-173000 in cluster newest-cni-173000
	* Restarting existing hyperkit VM for "newest-cni-173000" ...

-- /stdout --
** stderr ** 
	I0213 16:15:26.642139   11246 out.go:291] Setting OutFile to fd 1 ...
	I0213 16:15:26.642399   11246 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 16:15:26.642405   11246 out.go:304] Setting ErrFile to fd 2...
	I0213 16:15:26.642409   11246 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 16:15:26.642589   11246 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 16:15:26.643946   11246 out.go:298] Setting JSON to false
	I0213 16:15:26.667969   11246 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":5900,"bootTime":1707863826,"procs":448,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 16:15:26.668079   11246 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 16:15:26.689026   11246 out.go:177] * [newest-cni-173000] minikube v1.32.0 on Darwin 14.3.1
	I0213 16:15:26.711005   11246 out.go:177]   - MINIKUBE_LOCATION=18169
	I0213 16:15:26.711030   11246 notify.go:220] Checking for updates...
	I0213 16:15:26.753974   11246 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 16:15:26.774892   11246 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 16:15:26.818173   11246 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 16:15:26.861006   11246 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 16:15:26.902883   11246 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0213 16:15:26.924543   11246 config.go:182] Loaded profile config "newest-cni-173000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.29.0-rc.2
	I0213 16:15:26.925025   11246 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:15:26.925087   11246 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:15:26.933577   11246 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57654
	I0213 16:15:26.934013   11246 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:15:26.934448   11246 main.go:141] libmachine: Using API Version  1
	I0213 16:15:26.934459   11246 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:15:26.934739   11246 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:15:26.934852   11246 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:15:26.935048   11246 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 16:15:26.935291   11246 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:15:26.935315   11246 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:15:26.942982   11246 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57656
	I0213 16:15:26.943325   11246 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:15:26.943687   11246 main.go:141] libmachine: Using API Version  1
	I0213 16:15:26.943704   11246 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:15:26.943917   11246 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:15:26.944019   11246 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:15:26.973085   11246 out.go:177] * Using the hyperkit driver based on existing profile
	I0213 16:15:26.994037   11246 start.go:298] selected driver: hyperkit
	I0213 16:15:26.994065   11246 start.go:902] validating driver "hyperkit" against &{Name:newest-cni-173000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:newest-cni-173000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates:ServerSideApply=true ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:kubeadm Key:pod-network-cidr Value:10.42.0.0/16}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 16:15:26.994224   11246 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0213 16:15:26.999807   11246 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 16:15:27.000326   11246 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 16:15:27.008654   11246 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 16:15:27.013113   11246 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:15:27.013138   11246 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 16:15:27.013300   11246 start_flags.go:946] Waiting for components: map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true]
	I0213 16:15:27.013362   11246 cni.go:84] Creating CNI manager for ""
	I0213 16:15:27.013374   11246 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 16:15:27.013386   11246 start_flags.go:321] config:
	{Name:newest-cni-173000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:newest-cni-173000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates:ServerSideApply=true ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:kubeadm Key:pod-network-cidr Value:10.42.0.0/16}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:false default_sa:true extra:false kubelet:false node_ready:false system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 16:15:27.013533   11246 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 16:15:27.055961   11246 out.go:177] * Starting control plane node newest-cni-173000 in cluster newest-cni-173000
	I0213 16:15:27.076768   11246 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I0213 16:15:27.077263   11246 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I0213 16:15:27.077297   11246 cache.go:56] Caching tarball of preloaded images
	I0213 16:15:27.077456   11246 preload.go:174] Found /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0213 16:15:27.077468   11246 cache.go:59] Finished verifying existence of preloaded tar for  v1.29.0-rc.2 on docker
	I0213 16:15:27.077634   11246 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/newest-cni-173000/config.json ...
	I0213 16:15:27.078343   11246 start.go:365] acquiring machines lock for newest-cni-173000: {Name:mke947868f35224fa4aab1d5f0a66de1e12a8270 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0213 16:15:27.078737   11246 start.go:369] acquired machines lock for "newest-cni-173000" in 372.786µs
	I0213 16:15:27.078774   11246 start.go:96] Skipping create...Using existing machine configuration
	I0213 16:15:27.078815   11246 fix.go:54] fixHost starting: 
	I0213 16:15:27.079112   11246 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 16:15:27.079141   11246 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 16:15:27.087424   11246 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:57658
	I0213 16:15:27.087760   11246 main.go:141] libmachine: () Calling .GetVersion
	I0213 16:15:27.088115   11246 main.go:141] libmachine: Using API Version  1
	I0213 16:15:27.088126   11246 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 16:15:27.088378   11246 main.go:141] libmachine: () Calling .GetMachineName
	I0213 16:15:27.088508   11246 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:15:27.088606   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetState
	I0213 16:15:27.088701   11246 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:15:27.088773   11246 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11145
	I0213 16:15:27.089740   11246 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid 11145 missing from process table
	I0213 16:15:27.089777   11246 fix.go:102] recreateIfNeeded on newest-cni-173000: state=Stopped err=<nil>
	I0213 16:15:27.089798   11246 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	W0213 16:15:27.089889   11246 fix.go:128] unexpected machine state, will restart: <nil>
	I0213 16:15:27.111062   11246 out.go:177] * Restarting existing hyperkit VM for "newest-cni-173000" ...
	I0213 16:15:27.132872   11246 main.go:141] libmachine: (newest-cni-173000) Calling .Start
	I0213 16:15:27.133060   11246 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:15:27.133084   11246 main.go:141] libmachine: (newest-cni-173000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid
	I0213 16:15:27.133164   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Using UUID 4b16baee-b9c8-419f-8f65-773ebb2e9bdb
	I0213 16:15:27.158920   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Generated MAC e6:f9:f6:2c:6f:21
	I0213 16:15:27.158946   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000
	I0213 16:15:27.159062   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"4b16baee-b9c8-419f-8f65-773ebb2e9bdb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000357c20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 16:15:27.159098   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"4b16baee-b9c8-419f-8f65-773ebb2e9bdb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000357c20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0213 16:15:27.159198   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "4b16baee-b9c8-419f-8f65-773ebb2e9bdb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"}
	I0213 16:15:27.159253   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 4b16baee-b9c8-419f-8f65-773ebb2e9bdb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/newest-cni-173000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/tty,log=/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/console-ring -f kexec,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/bzimage,/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=newest-cni-173000"
	I0213 16:15:27.159269   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0213 16:15:27.160551   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 DEBUG: hyperkit: Pid is 11257
	I0213 16:15:27.161391   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Attempt 0
	I0213 16:15:27.161408   11246 main.go:141] libmachine: (newest-cni-173000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 16:15:27.161495   11246 main.go:141] libmachine: (newest-cni-173000) DBG | hyperkit pid from json: 11257
	I0213 16:15:27.163571   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Searching for e6:f9:f6:2c:6f:21 in /var/db/dhcpd_leases ...
	I0213 16:15:27.163657   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Found 45 entries in /var/db/dhcpd_leases!
	I0213 16:15:27.163670   11246 main.go:141] libmachine: (newest-cni-173000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.46 HWAddress:e6:f9:f6:2c:6f:21 ID:1,e6:f9:f6:2c:6f:21 Lease:0x65cc061d}
	I0213 16:15:27.163686   11246 main.go:141] libmachine: (newest-cni-173000) DBG | Found match: e6:f9:f6:2c:6f:21
	I0213 16:15:27.163696   11246 main.go:141] libmachine: (newest-cni-173000) DBG | IP: 192.169.0.46
	I0213 16:15:27.163768   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetConfigRaw
	I0213 16:15:27.164555   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetIP
	I0213 16:15:27.164744   11246 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/newest-cni-173000/config.json ...
	I0213 16:15:27.165268   11246 machine.go:88] provisioning docker machine ...
	I0213 16:15:27.165283   11246 main.go:141] libmachine: (newest-cni-173000) Calling .DriverName
	I0213 16:15:27.165400   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetMachineName
	I0213 16:15:27.165507   11246 buildroot.go:166] provisioning hostname "newest-cni-173000"
	I0213 16:15:27.165518   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetMachineName
	I0213 16:15:27.165607   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHHostname
	I0213 16:15:27.165706   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHPort
	I0213 16:15:27.165796   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 16:15:27.165929   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHKeyPath
	I0213 16:15:27.166033   11246 main.go:141] libmachine: (newest-cni-173000) Calling .GetSSHUsername
	I0213 16:15:27.166586   11246 main.go:141] libmachine: Using SSH client type: native
	I0213 16:15:27.166921   11246 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1407080] 0x1409d60 <nil>  [] 0s} 192.169.0.46 22 <nil> <nil>}
	I0213 16:15:27.166933   11246 main.go:141] libmachine: About to run SSH command:
	sudo hostname newest-cni-173000 && echo "newest-cni-173000" | sudo tee /etc/hostname
	I0213 16:15:27.168292   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0213 16:15:27.176340   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18169-2790/.minikube/machines/newest-cni-173000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0213 16:15:27.177312   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 16:15:27.177337   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 16:15:27.177345   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 16:15:27.177354   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 16:15:27.545365   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0213 16:15:27.545381   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0213 16:15:27.649386   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0213 16:15:27.649407   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0213 16:15:27.649418   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0213 16:15:27.649432   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0213 16:15:27.650315   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0213 16:15:27.650329   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0213 16:15:32.585426   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:32 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0213 16:15:32.585550   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:32 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0213 16:15:32.585562   11246 main.go:141] libmachine: (newest-cni-173000) DBG | 2024/02/13 16:15:32 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0213 16:16:42.169080   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:18:00.172713   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:19:18.176891   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:20:36.180903   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:21:54.184472   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:23:12.189389   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:24:30.193043   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:25:48.194594   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out
	I0213 16:27:06.238822   11246 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.46:22: connect: operation timed out

** /stderr **
start_stop_delete_test.go:259: failed to start minikube post-stop. args "out/minikube-darwin-amd64 start -p newest-cni-173000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2": signal: killed
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000
E0213 16:28:17.874187    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:19.154441    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:21.715842    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:26.837286    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:37.078098    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:39.338288    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:28:50.960515    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 16:28:53.925018    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:28:57.560922    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:29:01.524530    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 16:29:04.102347    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 16:29:05.818565    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000: exit status 3 (1m15.092041099s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0213 16:29:32.875549   11371 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out
	E0213 16:29:32.875571   11371 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.46:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "newest-cni-173000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/newest-cni/serial/SecondStart (846.22s)


Test pass (300/328)

Order passed test Duration
3 TestDownloadOnly/v1.16.0/json-events 38.06
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.34
9 TestDownloadOnly/v1.16.0/DeleteAll 0.39
10 TestDownloadOnly/v1.16.0/DeleteAlwaysSucceeds 0.37
12 TestDownloadOnly/v1.28.4/json-events 23.97
13 TestDownloadOnly/v1.28.4/preload-exists 0
16 TestDownloadOnly/v1.28.4/kubectl 0
17 TestDownloadOnly/v1.28.4/LogsDuration 0.38
18 TestDownloadOnly/v1.28.4/DeleteAll 0.39
19 TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds 0.37
21 TestDownloadOnly/v1.29.0-rc.2/json-events 20.29
22 TestDownloadOnly/v1.29.0-rc.2/preload-exists 0
25 TestDownloadOnly/v1.29.0-rc.2/kubectl 0
26 TestDownloadOnly/v1.29.0-rc.2/LogsDuration 0.44
27 TestDownloadOnly/v1.29.0-rc.2/DeleteAll 0.42
28 TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds 0.38
30 TestBinaryMirror 1.07
31 TestOffline 57.75
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.16
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.16
36 TestAddons/Setup 210.44
38 TestAddons/parallel/Registry 20.23
39 TestAddons/parallel/Ingress 20.45
40 TestAddons/parallel/InspektorGadget 10.57
41 TestAddons/parallel/MetricsServer 5.54
42 TestAddons/parallel/HelmTiller 10.35
44 TestAddons/parallel/CSI 58.47
45 TestAddons/parallel/Headlamp 13.16
46 TestAddons/parallel/CloudSpanner 5.41
47 TestAddons/parallel/LocalPath 59.4
48 TestAddons/parallel/NvidiaDevicePlugin 5.35
49 TestAddons/parallel/Yakd 5
52 TestAddons/serial/GCPAuth/Namespaces 0.09
53 TestAddons/StoppedEnableDisable 5.76
54 TestCertOptions 39.64
55 TestCertExpiration 241.55
56 TestDockerFlags 38.17
57 TestForceSystemdFlag 39.82
58 TestForceSystemdEnv 156.96
61 TestHyperKitDriverInstallOrUpdate 8.76
64 TestErrorSpam/setup 34.9
65 TestErrorSpam/start 1.51
66 TestErrorSpam/status 0.51
67 TestErrorSpam/pause 1.35
68 TestErrorSpam/unpause 1.34
69 TestErrorSpam/stop 5.69
72 TestFunctional/serial/CopySyncFile 0
73 TestFunctional/serial/StartWithProxy 55.42
74 TestFunctional/serial/AuditLog 0
75 TestFunctional/serial/SoftStart 40.87
76 TestFunctional/serial/KubeContext 0.04
77 TestFunctional/serial/KubectlGetPods 0.08
80 TestFunctional/serial/CacheCmd/cache/add_remote 10.24
81 TestFunctional/serial/CacheCmd/cache/add_local 1.74
82 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.08
83 TestFunctional/serial/CacheCmd/cache/list 0.08
84 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.2
85 TestFunctional/serial/CacheCmd/cache/cache_reload 2.56
86 TestFunctional/serial/CacheCmd/cache/delete 0.16
87 TestFunctional/serial/MinikubeKubectlCmd 1.12
88 TestFunctional/serial/MinikubeKubectlCmdDirectly 1.53
89 TestFunctional/serial/ExtraConfig 35.58
90 TestFunctional/serial/ComponentHealth 0.08
91 TestFunctional/serial/LogsCmd 3.25
92 TestFunctional/serial/LogsFileCmd 3.31
93 TestFunctional/serial/InvalidService 4.25
95 TestFunctional/parallel/ConfigCmd 0.51
96 TestFunctional/parallel/DashboardCmd 10.43
97 TestFunctional/parallel/DryRun 0.93
98 TestFunctional/parallel/InternationalLanguage 0.47
99 TestFunctional/parallel/StatusCmd 0.54
103 TestFunctional/parallel/ServiceCmdConnect 7.57
104 TestFunctional/parallel/AddonsCmd 0.26
105 TestFunctional/parallel/PersistentVolumeClaim 30.01
107 TestFunctional/parallel/SSHCmd 0.29
108 TestFunctional/parallel/CpCmd 0.93
109 TestFunctional/parallel/MySQL 28.93
110 TestFunctional/parallel/FileSync 0.17
111 TestFunctional/parallel/CertSync 1.07
115 TestFunctional/parallel/NodeLabels 0.09
117 TestFunctional/parallel/NonActiveRuntimeDisabled 0.13
119 TestFunctional/parallel/License 1.54
120 TestFunctional/parallel/Version/short 0.1
121 TestFunctional/parallel/Version/components 0.37
122 TestFunctional/parallel/ImageCommands/ImageListShort 0.19
123 TestFunctional/parallel/ImageCommands/ImageListTable 0.16
124 TestFunctional/parallel/ImageCommands/ImageListJson 0.16
125 TestFunctional/parallel/ImageCommands/ImageListYaml 0.17
126 TestFunctional/parallel/ImageCommands/ImageBuild 6.91
127 TestFunctional/parallel/ImageCommands/Setup 5.92
128 TestFunctional/parallel/DockerEnv/bash 0.71
129 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
130 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.21
131 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
132 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.94
133 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.16
134 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 8.67
135 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.07
136 TestFunctional/parallel/ImageCommands/ImageRemove 0.34
137 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.4
138 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.92
139 TestFunctional/parallel/ServiceCmd/DeployApp 16.12
141 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.43
142 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
144 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 10.14
145 TestFunctional/parallel/ServiceCmd/List 0.37
146 TestFunctional/parallel/ServiceCmd/JSONOutput 0.37
147 TestFunctional/parallel/ServiceCmd/HTTPS 0.24
148 TestFunctional/parallel/ServiceCmd/Format 0.24
149 TestFunctional/parallel/ServiceCmd/URL 0.25
150 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
151 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
152 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.04
153 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.03
154 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
155 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
156 TestFunctional/parallel/ProfileCmd/profile_not_create 0.38
157 TestFunctional/parallel/ProfileCmd/profile_list 0.28
158 TestFunctional/parallel/ProfileCmd/profile_json_output 0.29
159 TestFunctional/parallel/MountCmd/any-port 10.9
160 TestFunctional/parallel/MountCmd/specific-port 1.77
161 TestFunctional/parallel/MountCmd/VerifyCleanup 1.89
162 TestFunctional/delete_addon-resizer_images 0.22
163 TestFunctional/delete_my-image_image 0.05
164 TestFunctional/delete_minikube_cached_images 0.05
168 TestImageBuild/serial/Setup 36.87
169 TestImageBuild/serial/NormalBuild 5.38
170 TestImageBuild/serial/BuildWithBuildArg 0.72
171 TestImageBuild/serial/BuildWithDockerIgnore 0.27
172 TestImageBuild/serial/BuildWithSpecifiedDockerfile 0.21
175 TestIngressAddonLegacy/StartLegacyK8sCluster 94.31
177 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 19.29
178 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.54
179 TestIngressAddonLegacy/serial/ValidateIngressAddons 36.79
182 TestJSONOutput/start/Command 49.74
183 TestJSONOutput/start/Audit 0
185 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
186 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
188 TestJSONOutput/pause/Command 0.49
189 TestJSONOutput/pause/Audit 0
191 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
192 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
194 TestJSONOutput/unpause/Command 0.44
195 TestJSONOutput/unpause/Audit 0
197 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
198 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
200 TestJSONOutput/stop/Command 8.15
201 TestJSONOutput/stop/Audit 0
203 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
204 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
205 TestErrorJSONOutput 0.78
210 TestMainNoArgs 0.08
211 TestMinikubeProfile 85.76
214 TestMountStart/serial/StartWithMountFirst 16.77
215 TestMountStart/serial/VerifyMountFirst 0.31
216 TestMountStart/serial/StartWithMountSecond 16.79
217 TestMountStart/serial/VerifyMountSecond 0.31
218 TestMountStart/serial/DeleteFirst 2.41
219 TestMountStart/serial/VerifyMountPostDelete 0.31
220 TestMountStart/serial/Stop 2.25
221 TestMountStart/serial/RestartStopped 18.01
222 TestMountStart/serial/VerifyMountPostStop 0.29
225 TestMultiNode/serial/FreshStart2Nodes 101.03
226 TestMultiNode/serial/DeployApp2Nodes 9.2
227 TestMultiNode/serial/PingHostFrom2Pods 0.9
228 TestMultiNode/serial/AddNode 159.04
229 TestMultiNode/serial/MultiNodeLabels 0.08
230 TestMultiNode/serial/ProfileList 0.32
231 TestMultiNode/serial/CopyFile 5.34
232 TestMultiNode/serial/StopNode 2.69
233 TestMultiNode/serial/StartAfterStop 27.03
234 TestMultiNode/serial/RestartKeepsNodes 127.08
235 TestMultiNode/serial/DeleteNode 3.07
236 TestMultiNode/serial/StopMultiNode 16.49
237 TestMultiNode/serial/RestartMultiNode 80.55
238 TestMultiNode/serial/ValidateNameConflict 40.52
242 TestPreload 180.06
244 TestScheduledStopUnix 106
245 TestSkaffold 128.04
248 TestRunningBinaryUpgrade 113.44
250 TestKubernetesUpgrade 163.09
263 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 4.37
264 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 7.39
265 TestStoppedBinaryUpgrade/Setup 5
266 TestStoppedBinaryUpgrade/Upgrade 89.06
267 TestStoppedBinaryUpgrade/MinikubeLogs 2.99
269 TestPause/serial/Start 57.5
278 TestNoKubernetes/serial/StartNoK8sWithVersion 0.5
279 TestNoKubernetes/serial/StartWithK8s 38.33
280 TestPause/serial/SecondStartNoReconfiguration 38.9
281 TestNoKubernetes/serial/StartWithStopK8s 7.65
282 TestNoKubernetes/serial/Start 16.26
283 TestNoKubernetes/serial/VerifyK8sNotRunning 0.14
284 TestNoKubernetes/serial/ProfileList 0.53
285 TestNoKubernetes/serial/Stop 8.26
286 TestNoKubernetes/serial/StartNoArgs 15.35
287 TestPause/serial/Pause 0.5
288 TestPause/serial/VerifyStatus 0.16
289 TestPause/serial/Unpause 0.5
290 TestPause/serial/PauseAgain 0.56
291 TestPause/serial/DeletePaused 5.27
292 TestPause/serial/VerifyDeletedResources 0.18
293 TestNetworkPlugins/group/auto/Start 50.4
294 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.13
295 TestNetworkPlugins/group/kindnet/Start 65.49
296 TestNetworkPlugins/group/auto/KubeletFlags 0.15
297 TestNetworkPlugins/group/auto/NetCatPod 14.15
298 TestNetworkPlugins/group/auto/DNS 0.16
299 TestNetworkPlugins/group/auto/Localhost 0.11
300 TestNetworkPlugins/group/auto/HairPin 0.11
301 TestNetworkPlugins/group/kindnet/ControllerPod 6
302 TestNetworkPlugins/group/kindnet/KubeletFlags 0.16
303 TestNetworkPlugins/group/kindnet/NetCatPod 15.2
304 TestNetworkPlugins/group/calico/Start 78.75
305 TestNetworkPlugins/group/kindnet/DNS 0.14
306 TestNetworkPlugins/group/kindnet/Localhost 0.11
307 TestNetworkPlugins/group/kindnet/HairPin 0.1
308 TestNetworkPlugins/group/custom-flannel/Start 58.78
309 TestNetworkPlugins/group/calico/ControllerPod 6
310 TestNetworkPlugins/group/calico/KubeletFlags 0.17
311 TestNetworkPlugins/group/calico/NetCatPod 16.14
312 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.16
313 TestNetworkPlugins/group/custom-flannel/NetCatPod 16.15
314 TestNetworkPlugins/group/calico/DNS 0.13
315 TestNetworkPlugins/group/calico/Localhost 0.11
316 TestNetworkPlugins/group/calico/HairPin 0.11
317 TestNetworkPlugins/group/custom-flannel/DNS 0.13
318 TestNetworkPlugins/group/custom-flannel/Localhost 0.13
319 TestNetworkPlugins/group/custom-flannel/HairPin 0.12
320 TestNetworkPlugins/group/false/Start 52.84
321 TestNetworkPlugins/group/enable-default-cni/Start 61.33
322 TestNetworkPlugins/group/false/KubeletFlags 0.17
323 TestNetworkPlugins/group/false/NetCatPod 15.15
324 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.16
325 TestNetworkPlugins/group/enable-default-cni/NetCatPod 15.15
326 TestNetworkPlugins/group/false/DNS 0.12
327 TestNetworkPlugins/group/false/Localhost 0.11
328 TestNetworkPlugins/group/false/HairPin 0.1
329 TestNetworkPlugins/group/enable-default-cni/DNS 0.14
330 TestNetworkPlugins/group/enable-default-cni/Localhost 0.11
331 TestNetworkPlugins/group/enable-default-cni/HairPin 0.11
332 TestNetworkPlugins/group/flannel/Start 61.35
333 TestNetworkPlugins/group/bridge/Start 93.03
334 TestNetworkPlugins/group/flannel/ControllerPod 6
335 TestNetworkPlugins/group/flannel/KubeletFlags 0.17
336 TestNetworkPlugins/group/flannel/NetCatPod 15.15
337 TestNetworkPlugins/group/flannel/DNS 0.13
338 TestNetworkPlugins/group/flannel/Localhost 0.11
339 TestNetworkPlugins/group/flannel/HairPin 0.12
340 TestNetworkPlugins/group/kubenet/Start 59.26
341 TestNetworkPlugins/group/bridge/KubeletFlags 0.16
342 TestNetworkPlugins/group/bridge/NetCatPod 15.15
343 TestNetworkPlugins/group/bridge/DNS 0.12
344 TestNetworkPlugins/group/bridge/Localhost 0.11
345 TestNetworkPlugins/group/bridge/HairPin 0.11
347 TestStartStop/group/old-k8s-version/serial/FirstStart 132.6
348 TestNetworkPlugins/group/kubenet/KubeletFlags 0.17
349 TestNetworkPlugins/group/kubenet/NetCatPod 15.17
350 TestNetworkPlugins/group/kubenet/DNS 0.15
351 TestNetworkPlugins/group/kubenet/Localhost 0.1
352 TestNetworkPlugins/group/kubenet/HairPin 0.11
354 TestStartStop/group/no-preload/serial/FirstStart 61.64
355 TestStartStop/group/no-preload/serial/DeployApp 13.23
356 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 0.76
357 TestStartStop/group/no-preload/serial/Stop 8.27
358 TestStartStop/group/old-k8s-version/serial/DeployApp 13.32
359 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.34
360 TestStartStop/group/no-preload/serial/SecondStart 296.86
361 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.69
362 TestStartStop/group/old-k8s-version/serial/Stop 8.25
363 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.32
364 TestStartStop/group/old-k8s-version/serial/SecondStart 470.61
365 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 6
366 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.06
367 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.17
368 TestStartStop/group/no-preload/serial/Pause 1.9
370 TestStartStop/group/embed-certs/serial/FirstStart 168.22
371 TestStartStop/group/embed-certs/serial/DeployApp 14.23
372 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 6.01
373 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
374 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.17
375 TestStartStop/group/old-k8s-version/serial/Pause 1.77
376 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.81
377 TestStartStop/group/embed-certs/serial/Stop 8.23
379 TestStartStop/group/default-k8s-diff-port/serial/FirstStart 50.34
380 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.32
381 TestStartStop/group/embed-certs/serial/SecondStart 326.19
382 TestStartStop/group/default-k8s-diff-port/serial/DeployApp 13.23
383 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive 0.9
384 TestStartStop/group/default-k8s-diff-port/serial/Stop 8.3
385 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.32
387 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 6.01
388 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.06
389 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.17
390 TestStartStop/group/embed-certs/serial/Pause 1.9
394 TestStartStop/group/newest-cni/serial/DeployApp 0
397 TestStartStop/group/newest-cni/serial/Stop 1.24
398 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.32
TestDownloadOnly/v1.16.0/json-events (38.06s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-895000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-895000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (38.05485788s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (38.06s)

TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

TestDownloadOnly/v1.16.0/LogsDuration (0.34s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-895000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-895000: exit status 85 (343.940039ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-895000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST |          |
	|         | -p download-only-895000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/02/13 14:49:05
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.21.6 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0213 14:49:05.313470    3346 out.go:291] Setting OutFile to fd 1 ...
	I0213 14:49:05.313688    3346 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:49:05.313694    3346 out.go:304] Setting ErrFile to fd 2...
	I0213 14:49:05.313698    3346 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:49:05.313906    3346 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	W0213 14:49:05.314019    3346 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/18169-2790/.minikube/config/config.json: open /Users/jenkins/minikube-integration/18169-2790/.minikube/config/config.json: no such file or directory
	I0213 14:49:05.315853    3346 out.go:298] Setting JSON to true
	I0213 14:49:05.338512    3346 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":719,"bootTime":1707863826,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 14:49:05.338611    3346 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 14:49:05.363910    3346 out.go:97] [download-only-895000] minikube v1.32.0 on Darwin 14.3.1
	I0213 14:49:05.386795    3346 out.go:169] MINIKUBE_LOCATION=18169
	I0213 14:49:05.364155    3346 notify.go:220] Checking for updates...
	W0213 14:49:05.364157    3346 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball: no such file or directory
	I0213 14:49:05.431625    3346 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 14:49:05.452822    3346 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 14:49:05.473783    3346 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 14:49:05.494811    3346 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	W0213 14:49:05.536700    3346 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0213 14:49:05.537038    3346 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 14:49:05.614765    3346 out.go:97] Using the hyperkit driver based on user configuration
	I0213 14:49:05.614790    3346 start.go:298] selected driver: hyperkit
	I0213 14:49:05.614798    3346 start.go:902] validating driver "hyperkit" against <nil>
	I0213 14:49:05.614917    3346 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:49:05.615095    3346 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 14:49:05.872261    3346 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 14:49:05.877489    3346 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 14:49:05.877512    3346 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 14:49:05.877537    3346 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0213 14:49:05.880344    3346 start_flags.go:392] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0213 14:49:05.880478    3346 start_flags.go:909] Wait components to verify : map[apiserver:true system_pods:true]
	I0213 14:49:05.880538    3346 cni.go:84] Creating CNI manager for ""
	I0213 14:49:05.880551    3346 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I0213 14:49:05.880562    3346 start_flags.go:321] config:
	{Name:download-only-895000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-895000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Container
Runtime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 14:49:05.880833    3346 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:49:05.902518    3346 out.go:97] Downloading VM boot image ...
	I0213 14:49:05.902571    3346 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/iso/amd64/minikube-v1.32.1-1703784139-17866-amd64.iso
	I0213 14:49:22.571764    3346 out.go:97] Starting control plane node download-only-895000 in cluster download-only-895000
	I0213 14:49:22.571792    3346 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0213 14:49:22.843416    3346 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0213 14:49:22.843452    3346 cache.go:56] Caching tarball of preloaded images
	I0213 14:49:22.843689    3346 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0213 14:49:22.864408    3346 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I0213 14:49:22.864423    3346 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0213 14:49:23.447653    3346 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	
	
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-895000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.34s)

TestDownloadOnly/v1.16.0/DeleteAll (0.39s)

=== RUN   TestDownloadOnly/v1.16.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.16.0/DeleteAll (0.39s)

TestDownloadOnly/v1.16.0/DeleteAlwaysSucceeds (0.37s)

=== RUN   TestDownloadOnly/v1.16.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-895000
--- PASS: TestDownloadOnly/v1.16.0/DeleteAlwaysSucceeds (0.37s)

TestDownloadOnly/v1.28.4/json-events (23.97s)

=== RUN   TestDownloadOnly/v1.28.4/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-602000 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-602000 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=docker --driver=hyperkit : (23.96650114s)
--- PASS: TestDownloadOnly/v1.28.4/json-events (23.97s)

TestDownloadOnly/v1.28.4/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.4/preload-exists
--- PASS: TestDownloadOnly/v1.28.4/preload-exists (0.00s)

TestDownloadOnly/v1.28.4/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.4/kubectl
--- PASS: TestDownloadOnly/v1.28.4/kubectl (0.00s)

TestDownloadOnly/v1.28.4/LogsDuration (0.38s)

=== RUN   TestDownloadOnly/v1.28.4/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-602000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-602000: exit status 85 (374.962189ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-895000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST |                     |
	|         | -p download-only-895000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.32.0 | 13 Feb 24 14:49 PST | 13 Feb 24 14:49 PST |
	| delete  | -p download-only-895000        | download-only-895000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST | 13 Feb 24 14:49 PST |
	| start   | -o=json --download-only        | download-only-602000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST |                     |
	|         | -p download-only-602000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.28.4   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/02/13 14:49:44
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.21.6 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0213 14:49:44.481483    3403 out.go:291] Setting OutFile to fd 1 ...
	I0213 14:49:44.481792    3403 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:49:44.481799    3403 out.go:304] Setting ErrFile to fd 2...
	I0213 14:49:44.481806    3403 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:49:44.482003    3403 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 14:49:44.483889    3403 out.go:298] Setting JSON to true
	I0213 14:49:44.507446    3403 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":758,"bootTime":1707863826,"procs":432,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 14:49:44.507540    3403 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 14:49:44.529139    3403 out.go:97] [download-only-602000] minikube v1.32.0 on Darwin 14.3.1
	I0213 14:49:44.550976    3403 out.go:169] MINIKUBE_LOCATION=18169
	I0213 14:49:44.529335    3403 notify.go:220] Checking for updates...
	I0213 14:49:44.593983    3403 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 14:49:44.636881    3403 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 14:49:44.678798    3403 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 14:49:44.720781    3403 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	W0213 14:49:44.764863    3403 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0213 14:49:44.765266    3403 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 14:49:44.794770    3403 out.go:97] Using the hyperkit driver based on user configuration
	I0213 14:49:44.794821    3403 start.go:298] selected driver: hyperkit
	I0213 14:49:44.794834    3403 start.go:902] validating driver "hyperkit" against <nil>
	I0213 14:49:44.795063    3403 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:49:44.795244    3403 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 14:49:44.803949    3403 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 14:49:44.807763    3403 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 14:49:44.807794    3403 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 14:49:44.807824    3403 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0213 14:49:44.810613    3403 start_flags.go:392] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0213 14:49:44.810756    3403 start_flags.go:909] Wait components to verify : map[apiserver:true system_pods:true]
	I0213 14:49:44.810812    3403 cni.go:84] Creating CNI manager for ""
	I0213 14:49:44.810825    3403 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 14:49:44.810835    3403 start_flags.go:316] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0213 14:49:44.810844    3403 start_flags.go:321] config:
	{Name:download-only-602000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:download-only-602000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 14:49:44.810999    3403 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:49:44.831732    3403 out.go:97] Starting control plane node download-only-602000 in cluster download-only-602000
	I0213 14:49:44.831760    3403 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I0213 14:49:45.269490    3403 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.4/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I0213 14:49:45.269515    3403 cache.go:56] Caching tarball of preloaded images
	I0213 14:49:45.269686    3403 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I0213 14:49:45.291391    3403 out.go:97] Downloading Kubernetes v1.28.4 preload ...
	I0213 14:49:45.291407    3403 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 ...
	I0213 14:49:45.869305    3403 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.4/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4?checksum=md5:7ebdea7754e21f51b865dbfc36b53b7d -> /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	
	
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-602000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.4/LogsDuration (0.38s)

TestDownloadOnly/v1.28.4/DeleteAll (0.39s)

=== RUN   TestDownloadOnly/v1.28.4/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.28.4/DeleteAll (0.39s)

TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds (0.37s)

=== RUN   TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-602000
--- PASS: TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds (0.37s)

TestDownloadOnly/v1.29.0-rc.2/json-events (20.29s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-034000 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-034000 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=docker --driver=hyperkit : (20.293030384s)
--- PASS: TestDownloadOnly/v1.29.0-rc.2/json-events (20.29s)

TestDownloadOnly/v1.29.0-rc.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/preload-exists
--- PASS: TestDownloadOnly/v1.29.0-rc.2/preload-exists (0.00s)

TestDownloadOnly/v1.29.0-rc.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/kubectl
--- PASS: TestDownloadOnly/v1.29.0-rc.2/kubectl (0.00s)

TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.44s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-034000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-034000: exit status 85 (443.333486ms)

-- stdout --
	
	==> Audit <==
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |               Args                |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only           | download-only-895000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST |                     |
	|         | -p download-only-895000           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0      |                      |         |         |                     |                     |
	|         | --container-runtime=docker        |                      |         |         |                     |                     |
	|         | --driver=hyperkit                 |                      |         |         |                     |                     |
	| delete  | --all                             | minikube             | jenkins | v1.32.0 | 13 Feb 24 14:49 PST | 13 Feb 24 14:49 PST |
	| delete  | -p download-only-895000           | download-only-895000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST | 13 Feb 24 14:49 PST |
	| start   | -o=json --download-only           | download-only-602000 | jenkins | v1.32.0 | 13 Feb 24 14:49 PST |                     |
	|         | -p download-only-602000           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.28.4      |                      |         |         |                     |                     |
	|         | --container-runtime=docker        |                      |         |         |                     |                     |
	|         | --driver=hyperkit                 |                      |         |         |                     |                     |
	| delete  | --all                             | minikube             | jenkins | v1.32.0 | 13 Feb 24 14:50 PST | 13 Feb 24 14:50 PST |
	| delete  | -p download-only-602000           | download-only-602000 | jenkins | v1.32.0 | 13 Feb 24 14:50 PST | 13 Feb 24 14:50 PST |
	| start   | -o=json --download-only           | download-only-034000 | jenkins | v1.32.0 | 13 Feb 24 14:50 PST |                     |
	|         | -p download-only-034000           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.29.0-rc.2 |                      |         |         |                     |                     |
	|         | --container-runtime=docker        |                      |         |         |                     |                     |
	|         | --driver=hyperkit                 |                      |         |         |                     |                     |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/02/13 14:50:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.21.6 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0213 14:50:09.587562    3460 out.go:291] Setting OutFile to fd 1 ...
	I0213 14:50:09.587741    3460 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:50:09.587746    3460 out.go:304] Setting ErrFile to fd 2...
	I0213 14:50:09.587750    3460 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 14:50:09.587936    3460 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 14:50:09.589457    3460 out.go:298] Setting JSON to true
	I0213 14:50:09.612150    3460 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":783,"bootTime":1707863826,"procs":441,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 14:50:09.612241    3460 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 14:50:09.634042    3460 out.go:97] [download-only-034000] minikube v1.32.0 on Darwin 14.3.1
	I0213 14:50:09.655604    3460 out.go:169] MINIKUBE_LOCATION=18169
	I0213 14:50:09.634238    3460 notify.go:220] Checking for updates...
	I0213 14:50:09.698590    3460 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 14:50:09.740387    3460 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 14:50:09.783668    3460 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 14:50:09.826552    3460 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	W0213 14:50:09.869198    3460 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0213 14:50:09.869666    3460 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 14:50:09.899543    3460 out.go:97] Using the hyperkit driver based on user configuration
	I0213 14:50:09.899599    3460 start.go:298] selected driver: hyperkit
	I0213 14:50:09.899611    3460 start.go:902] validating driver "hyperkit" against <nil>
	I0213 14:50:09.899846    3460 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:50:09.900053    3460 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18169-2790/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0213 14:50:09.909772    3460 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I0213 14:50:09.913624    3460 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 14:50:09.913646    3460 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0213 14:50:09.913676    3460 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0213 14:50:09.916419    3460 start_flags.go:392] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0213 14:50:09.916582    3460 start_flags.go:909] Wait components to verify : map[apiserver:true system_pods:true]
	I0213 14:50:09.916643    3460 cni.go:84] Creating CNI manager for ""
	I0213 14:50:09.916659    3460 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0213 14:50:09.916674    3460 start_flags.go:316] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0213 14:50:09.916683    3460 start_flags.go:321] config:
	{Name:download-only-034000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:download-only-034000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 14:50:09.916840    3460 iso.go:125] acquiring lock: {Name:mk11c32e346f5bc1f067dee24ee83d9969db3d82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0213 14:50:09.938456    3460 out.go:97] Starting control plane node download-only-034000 in cluster download-only-034000
	I0213 14:50:09.938484    3460 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I0213 14:50:10.374900    3460 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I0213 14:50:10.374942    3460 cache.go:56] Caching tarball of preloaded images
	I0213 14:50:10.375248    3460 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I0213 14:50:10.397802    3460 out.go:97] Downloading Kubernetes v1.29.0-rc.2 preload ...
	I0213 14:50:10.397855    3460 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I0213 14:50:10.978150    3460 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4?checksum=md5:47acda482c3add5b56147c92b8d7f468 -> /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I0213 14:50:27.952353    3460 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I0213 14:50:27.952534    3460 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I0213 14:50:28.497758    3460 cache.go:59] Finished verifying existence of preloaded tar for  v1.29.0-rc.2 on docker
	I0213 14:50:28.498020    3460 profile.go:148] Saving config to /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/download-only-034000/config.json ...
	I0213 14:50:28.498043    3460 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/download-only-034000/config.json: {Name:mkec4b57ae52a302b2eaaf3db0a935bbcbe83a1a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0213 14:50:28.498347    3460 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I0213 14:50:28.498562    3460 download.go:107] Downloading: https://dl.k8s.io/release/v1.29.0-rc.2/bin/darwin/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.29.0-rc.2/bin/darwin/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/18169-2790/.minikube/cache/darwin/amd64/v1.29.0-rc.2/kubectl
	
	
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-034000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.44s)

TestDownloadOnly/v1.29.0-rc.2/DeleteAll (0.42s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.29.0-rc.2/DeleteAll (0.42s)

TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds (0.38s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-034000
--- PASS: TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds (0.38s)

TestBinaryMirror (1.07s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-844000 --alsologtostderr --binary-mirror http://127.0.0.1:49998 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-844000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-844000
--- PASS: TestBinaryMirror (1.07s)

TestOffline (57.75s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-587000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-587000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (52.481337356s)
helpers_test.go:175: Cleaning up "offline-docker-587000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-587000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-587000: (5.268339404s)
--- PASS: TestOffline (57.75s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.16s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:928: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-679000
addons_test.go:928: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable dashboard -p addons-679000: exit status 85 (163.935274ms)

-- stdout --
	* Profile "addons-679000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-679000"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.16s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.16s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:939: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-679000
addons_test.go:939: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons disable dashboard -p addons-679000: exit status 85 (163.927643ms)

-- stdout --
	* Profile "addons-679000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-679000"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.16s)

TestAddons/Setup (210.44s)

=== RUN   TestAddons/Setup
addons_test.go:109: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-679000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:109: (dbg) Done: out/minikube-darwin-amd64 start -p addons-679000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (3m30.444768729s)
--- PASS: TestAddons/Setup (210.44s)

TestAddons/parallel/Registry (20.23s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:330: registry stabilized in 9.374002ms
addons_test.go:332: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-8m7rz" [e9aba18a-95f5-4435-846b-24b5759fa8ff] Running
addons_test.go:332: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.003904422s
addons_test.go:335: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-6bvl6" [a5d8f765-7af7-4cf2-8293-2969d0bd6b8e] Running
addons_test.go:335: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.00278614s
addons_test.go:340: (dbg) Run:  kubectl --context addons-679000 delete po -l run=registry-test --now
addons_test.go:345: (dbg) Run:  kubectl --context addons-679000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:345: (dbg) Done: kubectl --context addons-679000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (9.610781524s)
addons_test.go:359: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 ip
2024/02/13 14:54:23 [DEBUG] GET http://192.169.0.3:5000
addons_test.go:388: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (20.23s)

TestAddons/parallel/Ingress (20.45s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:207: (dbg) Run:  kubectl --context addons-679000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:232: (dbg) Run:  kubectl --context addons-679000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:245: (dbg) Run:  kubectl --context addons-679000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:250: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [4b9eac01-f026-4d2c-a472-07cc82be9964] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [4b9eac01-f026-4d2c-a472-07cc82be9964] Running
addons_test.go:250: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 10.004327026s
addons_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:286: (dbg) Run:  kubectl --context addons-679000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:291: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 ip
addons_test.go:297: (dbg) Run:  nslookup hello-john.test 192.169.0.3
addons_test.go:306: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:306: (dbg) Done: out/minikube-darwin-amd64 -p addons-679000 addons disable ingress-dns --alsologtostderr -v=1: (1.869893899s)
addons_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable ingress --alsologtostderr -v=1
addons_test.go:311: (dbg) Done: out/minikube-darwin-amd64 -p addons-679000 addons disable ingress --alsologtostderr -v=1: (7.517638564s)
--- PASS: TestAddons/parallel/Ingress (20.45s)

TestAddons/parallel/InspektorGadget (10.57s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:838: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-cqvqn" [5a49f680-3484-4fd9-8b26-fa2cd9d176d5] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:838: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.003004815s
addons_test.go:841: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-679000
addons_test.go:841: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-679000: (5.564534861s)
--- PASS: TestAddons/parallel/InspektorGadget (10.57s)

TestAddons/parallel/MetricsServer (5.54s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:407: metrics-server stabilized in 2.458112ms
addons_test.go:409: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-69cf46c98-5z25g" [4aec0ce2-5207-496d-90c3-4a9bbb03c336] Running
addons_test.go:409: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.00539151s
addons_test.go:415: (dbg) Run:  kubectl --context addons-679000 top pods -n kube-system
addons_test.go:432: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.54s)

TestAddons/parallel/HelmTiller (10.35s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:456: tiller-deploy stabilized in 3.243014ms
addons_test.go:458: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-7b677967b9-6f5pv" [98beba36-382b-4b94-bc28-0f170736c6ed] Running
addons_test.go:458: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.003956375s
addons_test.go:473: (dbg) Run:  kubectl --context addons-679000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:473: (dbg) Done: kubectl --context addons-679000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.915594101s)
addons_test.go:490: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (10.35s)

TestAddons/parallel/CSI (58.47s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:561: csi-hostpath-driver pods stabilized in 9.778878ms
addons_test.go:564: (dbg) Run:  kubectl --context addons-679000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:569: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:574: (dbg) Run:  kubectl --context addons-679000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:579: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [80f79bf9-9532-4550-918a-4fdd4e32ab44] Pending
helpers_test.go:344: "task-pv-pod" [80f79bf9-9532-4550-918a-4fdd4e32ab44] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [80f79bf9-9532-4550-918a-4fdd4e32ab44] Running
addons_test.go:579: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 14.002848425s
addons_test.go:584: (dbg) Run:  kubectl --context addons-679000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:589: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-679000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-679000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:594: (dbg) Run:  kubectl --context addons-679000 delete pod task-pv-pod
addons_test.go:594: (dbg) Done: kubectl --context addons-679000 delete pod task-pv-pod: (1.437379818s)
addons_test.go:600: (dbg) Run:  kubectl --context addons-679000 delete pvc hpvc
addons_test.go:606: (dbg) Run:  kubectl --context addons-679000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:611: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
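The long run of identical `get pvc hpvc-restore -o jsonpath={.status.phase}` lines above is the test helper polling the claim once per interval until it reports `Bound`. A minimal sketch of that wait-loop pattern, assuming a POSIX shell; `wait_for` and `check_phase` are illustrative names, not helpers from the minikube test suite:

```shell
#!/bin/sh
# Poll a probe command once per second until it prints the wanted value
# or the timeout (in seconds) expires.
wait_for() {
  want=$1; timeout=$2; probe=$3; i=0
  while [ "$i" -lt "$timeout" ]; do
    got=$("$probe")
    if [ "$got" = "$want" ]; then
      echo "ready after ${i}s"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out waiting for $want" >&2
  return 1
}

# Stand-in probe; a real run would invoke something like:
#   kubectl --context addons-679000 get pvc hpvc-restore -o jsonpath={.status.phase}
check_phase() { echo "Bound"; }

wait_for Bound 5 check_phase
```

With the stub probe the loop succeeds on the first check; against a live cluster the probe would keep returning `Pending` until the CSI provisioner binds the restored volume.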
addons_test.go:616: (dbg) Run:  kubectl --context addons-679000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:621: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [6352d7ea-2e4d-4cc3-9c08-7ee3c5c76ed8] Pending
helpers_test.go:344: "task-pv-pod-restore" [6352d7ea-2e4d-4cc3-9c08-7ee3c5c76ed8] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [6352d7ea-2e4d-4cc3-9c08-7ee3c5c76ed8] Running
addons_test.go:621: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 9.002744126s
addons_test.go:626: (dbg) Run:  kubectl --context addons-679000 delete pod task-pv-pod-restore
addons_test.go:630: (dbg) Run:  kubectl --context addons-679000 delete pvc hpvc-restore
addons_test.go:634: (dbg) Run:  kubectl --context addons-679000 delete volumesnapshot new-snapshot-demo
addons_test.go:638: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:638: (dbg) Done: out/minikube-darwin-amd64 -p addons-679000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.525203787s)
addons_test.go:642: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (58.47s)

TestAddons/parallel/Headlamp (13.16s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:824: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-679000 --alsologtostderr -v=1
addons_test.go:824: (dbg) Done: out/minikube-darwin-amd64 addons enable headlamp -p addons-679000 --alsologtostderr -v=1: (1.154857442s)
addons_test.go:829: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-7ddfbb94ff-5kr9c" [30ce3359-b41a-47c0-9607-68dd44647db9] Pending
helpers_test.go:344: "headlamp-7ddfbb94ff-5kr9c" [30ce3359-b41a-47c0-9607-68dd44647db9] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-7ddfbb94ff-5kr9c" [30ce3359-b41a-47c0-9607-68dd44647db9] Running / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-7ddfbb94ff-5kr9c" [30ce3359-b41a-47c0-9607-68dd44647db9] Running
addons_test.go:829: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.00417157s
--- PASS: TestAddons/parallel/Headlamp (13.16s)

TestAddons/parallel/CloudSpanner (5.41s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:857: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-64c8c85f65-pkn5d" [863a0234-65f4-4ee6-b123-52b24a570c79] Running
addons_test.go:857: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.00366537s
addons_test.go:860: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-679000
--- PASS: TestAddons/parallel/CloudSpanner (5.41s)

TestAddons/parallel/LocalPath (59.4s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:873: (dbg) Run:  kubectl --context addons-679000 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:879: (dbg) Run:  kubectl --context addons-679000 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:883: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:886: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [ba7712d4-74f0-4280-ba38-71f0caa12728] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [ba7712d4-74f0-4280-ba38-71f0caa12728] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [ba7712d4-74f0-4280-ba38-71f0caa12728] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:886: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 5.003910489s
addons_test.go:891: (dbg) Run:  kubectl --context addons-679000 get pvc test-pvc -o=json
addons_test.go:900: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 ssh "cat /opt/local-path-provisioner/pvc-6ba3ac69-66cf-42a3-87a7-8e15de332841_default_test-pvc/file1"
addons_test.go:912: (dbg) Run:  kubectl --context addons-679000 delete pod test-local-path
addons_test.go:916: (dbg) Run:  kubectl --context addons-679000 delete pvc test-pvc
addons_test.go:920: (dbg) Run:  out/minikube-darwin-amd64 -p addons-679000 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:920: (dbg) Done: out/minikube-darwin-amd64 -p addons-679000 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.76681683s)
--- PASS: TestAddons/parallel/LocalPath (59.40s)

TestAddons/parallel/NvidiaDevicePlugin (5.35s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:952: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-v56p9" [8e6ad308-18df-4fb7-9751-8a90ae67e931] Running
addons_test.go:952: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.00268153s
addons_test.go:955: (dbg) Run:  out/minikube-darwin-amd64 addons disable nvidia-device-plugin -p addons-679000
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.35s)

TestAddons/parallel/Yakd (5s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:963: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:344: "yakd-dashboard-9947fc6bf-fhsm7" [7fc86e42-c192-43e1-9b31-d2294047e3bb] Running
addons_test.go:963: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.002888509s
--- PASS: TestAddons/parallel/Yakd (5.00s)

TestAddons/serial/GCPAuth/Namespaces (0.09s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:650: (dbg) Run:  kubectl --context addons-679000 create ns new-namespace
addons_test.go:664: (dbg) Run:  kubectl --context addons-679000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.09s)

TestAddons/StoppedEnableDisable (5.76s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:172: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-679000
addons_test.go:172: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-679000: (5.209035176s)
addons_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-679000
addons_test.go:180: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-679000
addons_test.go:185: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-679000
--- PASS: TestAddons/StoppedEnableDisable (5.76s)

TestCertOptions (39.64s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-085000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-085000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (34.00847167s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-085000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-085000 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-085000 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-085000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-085000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-085000: (5.268494628s)
--- PASS: TestCertOptions (39.64s)

TestCertExpiration (241.55s)
=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-789000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-789000 --memory=2048 --cert-expiration=3m --driver=hyperkit : (34.124181171s)
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-789000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
E0213 15:30:57.763743    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-789000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (22.15352086s)
helpers_test.go:175: Cleaning up "cert-expiration-789000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-789000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-789000: (5.274446305s)
--- PASS: TestCertExpiration (241.55s)

TestDockerFlags (38.17s)
=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags
=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-065000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:51: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-065000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (34.430247465s)
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-065000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-065000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-065000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-065000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-065000: (3.402732827s)
--- PASS: TestDockerFlags (38.17s)

TestForceSystemdFlag (39.82s)
=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag
=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-359000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:91: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-359000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (34.378275957s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-359000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-359000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-359000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-359000: (5.268638404s)
--- PASS: TestForceSystemdFlag (39.82s)

TestForceSystemdEnv (156.96s)
=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv
=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-917000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
E0213 15:25:28.685513    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
docker_test.go:155: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-917000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (2m31.50061295s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-917000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-917000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-917000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-917000: (5.290922341s)
--- PASS: TestForceSystemdEnv (156.96s)

TestHyperKitDriverInstallOrUpdate (8.76s)
=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate
=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (8.76s)

TestErrorSpam/setup (34.9s)
=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-492000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-492000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 --driver=hyperkit : (34.898458516s)
--- PASS: TestErrorSpam/setup (34.90s)

TestErrorSpam/start (1.51s)
=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 start --dry-run
--- PASS: TestErrorSpam/start (1.51s)

TestErrorSpam/status (0.51s)
=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 status
--- PASS: TestErrorSpam/status (0.51s)

TestErrorSpam/pause (1.35s)
=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 pause
--- PASS: TestErrorSpam/pause (1.35s)

TestErrorSpam/unpause (1.34s)
=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 unpause
--- PASS: TestErrorSpam/unpause (1.34s)

TestErrorSpam/stop (5.69s)
=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 stop: (5.236790317s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 stop
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-492000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-492000 stop
--- PASS: TestErrorSpam/stop (5.69s)

TestFunctional/serial/CopySyncFile (0s)
=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /Users/jenkins/minikube-integration/18169-2790/.minikube/files/etc/test/nested/copy/3342/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (55.42s)
=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2230: (dbg) Done: out/minikube-darwin-amd64 start -p functional-634000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (55.423008185s)
--- PASS: TestFunctional/serial/StartWithProxy (55.42s)

TestFunctional/serial/AuditLog (0s)
=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (40.87s)
=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --alsologtostderr -v=8
functional_test.go:655: (dbg) Done: out/minikube-darwin-amd64 start -p functional-634000 --alsologtostderr -v=8: (40.873347657s)
functional_test.go:659: soft start took 40.873846509s for "functional-634000" cluster.
--- PASS: TestFunctional/serial/SoftStart (40.87s)

TestFunctional/serial/KubeContext (0.04s)
=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.08s)
=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-634000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.08s)

TestFunctional/serial/CacheCmd/cache/add_remote (10.24s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:3.1: (4.215726785s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:3.3: (3.528494597s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:latest
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 cache add registry.k8s.io/pause:latest: (2.493723215s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (10.24s)

TestFunctional/serial/CacheCmd/cache/add_local (1.74s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialCacheCmdcacheadd_local3213463412/001
functional_test.go:1085: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache add minikube-local-cache-test:functional-634000
functional_test.go:1090: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache delete minikube-local-cache-test:functional-634000
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-634000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.74s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)
=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.2s)
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.20s)

TestFunctional/serial/CacheCmd/cache/cache_reload (2.56s)
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh sudo docker rmi registry.k8s.io/pause:latest
E0213 14:59:03.879602    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:03.885713    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:03.895846    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:03.916994    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:03.958289    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1149: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
E0213 14:59:04.039600    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (157.573049ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cache reload
E0213 14:59:04.199870    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:04.520085    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:05.160414    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1154: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 cache reload: (2.057016377s)
functional_test.go:1159: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (2.56s)

TestFunctional/serial/CacheCmd/cache/delete (0.16s)
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
E0213 14:59:06.442333    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.16s)

TestFunctional/serial/MinikubeKubectlCmd (1.12s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 kubectl -- --context functional-634000 get pods
functional_test.go:712: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 kubectl -- --context functional-634000 get pods: (1.121384081s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (1.12s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (1.53s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-634000 get pods
E0213 14:59:09.003854    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:737: (dbg) Done: out/kubectl --context functional-634000 get pods: (1.527280172s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (1.53s)

TestFunctional/serial/ExtraConfig (35.58s)
=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0213 14:59:14.124763    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 14:59:24.365932    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-darwin-amd64 start -p functional-634000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (35.579321475s)
functional_test.go:757: restart took 35.579468639s for "functional-634000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (35.58s)

TestFunctional/serial/ComponentHealth (0.08s)
=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-634000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.08s)

TestFunctional/serial/LogsCmd (3.25s)
=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 logs
E0213 14:59:44.845571    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1232: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 logs: (3.250053748s)
--- PASS: TestFunctional/serial/LogsCmd (3.25s)

TestFunctional/serial/LogsFileCmd (3.31s)
=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd1070508363/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd1070508363/001/logs.txt: (3.307096761s)
--- PASS: TestFunctional/serial/LogsFileCmd (3.31s)

TestFunctional/serial/InvalidService (4.25s)
=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-634000 apply -f testdata/invalidsvc.yaml
functional_test.go:2331: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-634000
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-634000: exit status 115 (271.364661ms)

-- stdout --
	|-----------|-------------|-------------|--------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |           URL            |
	|-----------|-------------|-------------|--------------------------|
	| default   | invalid-svc |          80 | http://192.169.0.5:31804 |
	|-----------|-------------|-------------|--------------------------|
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-634000 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.25s)

TestFunctional/parallel/ConfigCmd (0.51s)
=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 config get cpus: exit status 14 (73.26803ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 config get cpus: exit status 14 (56.857868ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.51s)

TestFunctional/parallel/DashboardCmd (10.43s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-634000 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-634000 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 5065: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (10.43s)

TestFunctional/parallel/DryRun (0.93s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:970: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-634000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (502.460089ms)

-- stdout --
	* [functional-634000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I0213 15:01:03.768893    5030 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:01:03.769152    5030 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:01:03.769157    5030 out.go:304] Setting ErrFile to fd 2...
	I0213 15:01:03.769162    5030 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:01:03.769334    5030 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:01:03.770788    5030 out.go:298] Setting JSON to false
	I0213 15:01:03.798442    5030 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1437,"bootTime":1707863826,"procs":496,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 15:01:03.798541    5030 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 15:01:03.821065    5030 out.go:177] * [functional-634000] minikube v1.32.0 on Darwin 14.3.1
	I0213 15:01:03.862938    5030 out.go:177]   - MINIKUBE_LOCATION=18169
	I0213 15:01:03.862969    5030 notify.go:220] Checking for updates...
	I0213 15:01:03.905629    5030 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:01:03.926731    5030 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 15:01:03.947779    5030 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 15:01:03.969111    5030 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 15:01:03.989785    5030 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0213 15:01:04.011591    5030 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:01:04.012290    5030 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:01:04.012404    5030 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:01:04.021306    5030 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51209
	I0213 15:01:04.021705    5030 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:01:04.022145    5030 main.go:141] libmachine: Using API Version  1
	I0213 15:01:04.022155    5030 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:01:04.022417    5030 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:01:04.022531    5030 main.go:141] libmachine: (functional-634000) Calling .DriverName
	I0213 15:01:04.022731    5030 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 15:01:04.022989    5030 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:01:04.023014    5030 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:01:04.031039    5030 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51211
	I0213 15:01:04.031377    5030 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:01:04.031723    5030 main.go:141] libmachine: Using API Version  1
	I0213 15:01:04.031737    5030 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:01:04.031921    5030 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:01:04.032020    5030 main.go:141] libmachine: (functional-634000) Calling .DriverName
	I0213 15:01:04.060959    5030 out.go:177] * Using the hyperkit driver based on existing profile
	I0213 15:01:04.102784    5030 start.go:298] selected driver: hyperkit
	I0213 15:01:04.102798    5030 start.go:902] validating driver "hyperkit" against &{Name:functional-634000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.4 ClusterName:functional-634000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.169.0.5 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisk
s:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:01:04.102903    5030 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0213 15:01:04.128797    5030 out.go:177] 
	W0213 15:01:04.149841    5030 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0213 15:01:04.170864    5030 out.go:177] 

** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (0.93s)

TestFunctional/parallel/InternationalLanguage (0.47s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-634000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-634000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (465.467928ms)

-- stdout --
	* [functional-634000] minikube v1.32.0 sur Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I0213 15:01:04.692198    5046 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:01:04.692352    5046 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:01:04.692357    5046 out.go:304] Setting ErrFile to fd 2...
	I0213 15:01:04.692361    5046 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:01:04.692564    5046 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:01:04.694165    5046 out.go:298] Setting JSON to false
	I0213 15:01:04.717633    5046 start.go:128] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1438,"bootTime":1707863826,"procs":500,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.3.1","kernelVersion":"23.3.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0213 15:01:04.717730    5046 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0213 15:01:04.739065    5046 out.go:177] * [functional-634000] minikube v1.32.0 sur Darwin 14.3.1
	I0213 15:01:04.780815    5046 out.go:177]   - MINIKUBE_LOCATION=18169
	I0213 15:01:04.780910    5046 notify.go:220] Checking for updates...
	I0213 15:01:04.801902    5046 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	I0213 15:01:04.822765    5046 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0213 15:01:04.843679    5046 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0213 15:01:04.864793    5046 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	I0213 15:01:04.885948    5046 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0213 15:01:04.907432    5046 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:01:04.907942    5046 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:01:04.908011    5046 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:01:04.916850    5046 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51219
	I0213 15:01:04.917227    5046 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:01:04.917655    5046 main.go:141] libmachine: Using API Version  1
	I0213 15:01:04.917685    5046 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:01:04.917908    5046 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:01:04.918014    5046 main.go:141] libmachine: (functional-634000) Calling .DriverName
	I0213 15:01:04.918202    5046 driver.go:392] Setting default libvirt URI to qemu:///system
	I0213 15:01:04.918468    5046 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:01:04.918491    5046 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:01:04.926431    5046 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51221
	I0213 15:01:04.926771    5046 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:01:04.927137    5046 main.go:141] libmachine: Using API Version  1
	I0213 15:01:04.927152    5046 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:01:04.927428    5046 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:01:04.927538    5046 main.go:141] libmachine: (functional-634000) Calling .DriverName
	I0213 15:01:04.955737    5046 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0213 15:01:04.997639    5046 start.go:298] selected driver: hyperkit
	I0213 15:01:04.997662    5046 start.go:902] validating driver "hyperkit" against &{Name:functional-634000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17866/minikube-v1.32.1-1703784139-17866-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1704759386-17866@sha256:8c3c33047f9bc285e1f5f2a5aa14744a2fe04c58478f02f77b06169dea8dd3f0 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.4 ClusterName:functional-634000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.169.0.5 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisk
s:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs:}
	I0213 15:01:04.997858    5046 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0213 15:01:05.024876    5046 out.go:177] 
	W0213 15:01:05.045705    5046 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0213 15:01:05.066811    5046 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.47s)

TestFunctional/parallel/StatusCmd (0.54s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 status
functional_test.go:856: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.54s)

TestFunctional/parallel/ServiceCmdConnect (7.57s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1625: (dbg) Run:  kubectl --context functional-634000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1631: (dbg) Run:  kubectl --context functional-634000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-55497b8b78-nnmw7" [667ea665-a4f4-4495-ac46-71666c581c72] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-55497b8b78-nnmw7" [667ea665-a4f4-4495-ac46-71666c581c72] Running
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.00421565s
functional_test.go:1645: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service hello-node-connect --url
functional_test.go:1651: found endpoint for hello-node-connect: http://192.169.0.5:30591
functional_test.go:1671: http://192.169.0.5:30591: success! body:

Hostname: hello-node-connect-55497b8b78-nnmw7

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.169.0.5:8080/

Request Headers:
	accept-encoding=gzip
	host=192.169.0.5:30591
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.57s)

TestFunctional/parallel/AddonsCmd (0.26s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1686: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 addons list
functional_test.go:1698: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.26s)

TestFunctional/parallel/PersistentVolumeClaim (30.01s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [6844a949-ac44-419b-bdab-762275e0a645] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.005257537s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-634000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-634000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-634000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-634000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [7ee08737-6931-44a4-a1f7-79f6b74d60f4] Pending
helpers_test.go:344: "sp-pod" [7ee08737-6931-44a4-a1f7-79f6b74d60f4] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [7ee08737-6931-44a4-a1f7-79f6b74d60f4] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 15.003259017s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-634000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-634000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:106: (dbg) Done: kubectl --context functional-634000 delete -f testdata/storage-provisioner/pod.yaml: (1.42158089s)
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-634000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [9e3fce93-1489-48da-a955-79a69922a748] Pending
helpers_test.go:344: "sp-pod" [9e3fce93-1489-48da-a955-79a69922a748] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [9e3fce93-1489-48da-a955-79a69922a748] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.00457764s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-634000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (30.01s)

TestFunctional/parallel/SSHCmd (0.29s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1721: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "echo hello"
functional_test.go:1738: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.29s)

TestFunctional/parallel/CpCmd (0.93s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh -n functional-634000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cp functional-634000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelCpCmd739638251/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh -n functional-634000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh -n functional-634000 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.93s)

TestFunctional/parallel/MySQL (28.93s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-634000 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-859648c796-xfcwr" [37c39019-44e4-4e2c-8c3e-2b4ebc630983] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-859648c796-xfcwr" [37c39019-44e4-4e2c-8c3e-2b4ebc630983] Running
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 25.002513931s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-634000 exec mysql-859648c796-xfcwr -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-634000 exec mysql-859648c796-xfcwr -- mysql -ppassword -e "show databases;": exit status 1 (166.702066ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
E0213 15:00:25.804799    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
functional_test.go:1803: (dbg) Run:  kubectl --context functional-634000 exec mysql-859648c796-xfcwr -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-634000 exec mysql-859648c796-xfcwr -- mysql -ppassword -e "show databases;": exit status 1 (222.263108ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-634000 exec mysql-859648c796-xfcwr -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (28.93s)

TestFunctional/parallel/FileSync (0.17s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/3342/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /etc/test/nested/copy/3342/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.17s)

TestFunctional/parallel/CertSync (1.07s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/3342.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /etc/ssl/certs/3342.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/3342.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /usr/share/ca-certificates/3342.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/33422.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /etc/ssl/certs/33422.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/33422.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /usr/share/ca-certificates/33422.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.07s)

TestFunctional/parallel/NodeLabels (0.09s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-634000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.09s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.13s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "sudo systemctl is-active crio": exit status 1 (127.470634ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.13s)

TestFunctional/parallel/License (1.54s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-darwin-amd64 license
functional_test.go:2284: (dbg) Done: out/minikube-darwin-amd64 license: (1.535824198s)
--- PASS: TestFunctional/parallel/License (1.54s)

TestFunctional/parallel/Version/short (0.1s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)

TestFunctional/parallel/Version/components (0.37s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.37s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-634000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.28.4
registry.k8s.io/kube-proxy:v1.28.4
registry.k8s.io/kube-controller-manager:v1.28.4
registry.k8s.io/kube-apiserver:v1.28.4
registry.k8s.io/etcd:3.5.9-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.10.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-634000
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-634000
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-634000 image ls --format short --alsologtostderr:
I0213 15:01:09.860975    5085 out.go:291] Setting OutFile to fd 1 ...
I0213 15:01:09.861307    5085 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:09.861314    5085 out.go:304] Setting ErrFile to fd 2...
I0213 15:01:09.861318    5085 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:09.861507    5085 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
I0213 15:01:09.862168    5085 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:09.862275    5085 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:09.862662    5085 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:09.862713    5085 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:09.870814    5085 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51279
I0213 15:01:09.871245    5085 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:09.871686    5085 main.go:141] libmachine: Using API Version  1
I0213 15:01:09.871713    5085 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:09.871952    5085 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:09.872067    5085 main.go:141] libmachine: (functional-634000) Calling .GetState
I0213 15:01:09.872152    5085 main.go:141] libmachine: (functional-634000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0213 15:01:09.872214    5085 main.go:141] libmachine: (functional-634000) DBG | hyperkit pid from json: 4213
I0213 15:01:09.873562    5085 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:09.873585    5085 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:09.881778    5085 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51281
I0213 15:01:09.882195    5085 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:09.882556    5085 main.go:141] libmachine: Using API Version  1
I0213 15:01:09.882567    5085 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:09.882763    5085 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:09.882870    5085 main.go:141] libmachine: (functional-634000) Calling .DriverName
I0213 15:01:09.883077    5085 ssh_runner.go:195] Run: systemctl --version
I0213 15:01:09.883099    5085 main.go:141] libmachine: (functional-634000) Calling .GetSSHHostname
I0213 15:01:09.883187    5085 main.go:141] libmachine: (functional-634000) Calling .GetSSHPort
I0213 15:01:09.883277    5085 main.go:141] libmachine: (functional-634000) Calling .GetSSHKeyPath
I0213 15:01:09.883393    5085 main.go:141] libmachine: (functional-634000) Calling .GetSSHUsername
I0213 15:01:09.883491    5085 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/functional-634000/id_rsa Username:docker}
I0213 15:01:09.928651    5085 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0213 15:01:09.961896    5085 main.go:141] libmachine: Making call to close driver server
I0213 15:01:09.961907    5085 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:09.962074    5085 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:09.962084    5085 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:09.962089    5085 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:09.962091    5085 main.go:141] libmachine: Making call to close driver server
I0213 15:01:09.962178    5085 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:09.962348    5085 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:09.962377    5085 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:09.962394    5085 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-634000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/kube-controller-manager     | v1.28.4           | d058aa5ab969c | 122MB  |
| registry.k8s.io/kube-scheduler              | v1.28.4           | e3db313c6dbc0 | 60.1MB |
| docker.io/library/nginx                     | alpine            | 2b70e4aaac6b5 | 42.6MB |
| docker.io/kubernetesui/dashboard            | <none>            | 07655ddf2eebe | 246MB  |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| docker.io/library/minikube-local-cache-test | functional-634000 | 8df0b0e72a01e | 30B    |
| docker.io/library/mysql                     | 5.7               | 5107333e08a87 | 501MB  |
| registry.k8s.io/kube-apiserver              | v1.28.4           | 7fe0e6f37db33 | 126MB  |
| registry.k8s.io/etcd                        | 3.5.9-0           | 73deb9a3f7025 | 294MB  |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| registry.k8s.io/kube-proxy                  | v1.28.4           | 83f6cc407eed8 | 73.2MB |
| registry.k8s.io/coredns/coredns             | v1.10.1           | ead0a4a53df89 | 53.6MB |
| registry.k8s.io/pause                       | 3.9               | e6f1816883972 | 744kB  |
| gcr.io/google-containers/addon-resizer      | functional-634000 | ffd4cfbbe753e | 32.9MB |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
| docker.io/library/nginx                     | latest            | 247f7abff9f70 | 187MB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-634000 image ls --format table --alsologtostderr:
I0213 15:01:15.758625    5111 out.go:291] Setting OutFile to fd 1 ...
I0213 15:01:15.758832    5111 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:15.758837    5111 out.go:304] Setting ErrFile to fd 2...
I0213 15:01:15.758841    5111 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:15.759026    5111 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
I0213 15:01:15.759707    5111 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:15.759811    5111 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:15.760201    5111 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:15.760244    5111 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:15.768021    5111 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51310
I0213 15:01:15.768504    5111 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:15.769026    5111 main.go:141] libmachine: Using API Version  1
I0213 15:01:15.769053    5111 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:15.769301    5111 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:15.769422    5111 main.go:141] libmachine: (functional-634000) Calling .GetState
I0213 15:01:15.769517    5111 main.go:141] libmachine: (functional-634000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0213 15:01:15.769578    5111 main.go:141] libmachine: (functional-634000) DBG | hyperkit pid from json: 4213
I0213 15:01:15.770821    5111 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:15.770844    5111 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:15.778920    5111 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51312
I0213 15:01:15.779282    5111 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:15.779655    5111 main.go:141] libmachine: Using API Version  1
I0213 15:01:15.779674    5111 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:15.779884    5111 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:15.780001    5111 main.go:141] libmachine: (functional-634000) Calling .DriverName
I0213 15:01:15.780167    5111 ssh_runner.go:195] Run: systemctl --version
I0213 15:01:15.780191    5111 main.go:141] libmachine: (functional-634000) Calling .GetSSHHostname
I0213 15:01:15.780281    5111 main.go:141] libmachine: (functional-634000) Calling .GetSSHPort
I0213 15:01:15.780358    5111 main.go:141] libmachine: (functional-634000) Calling .GetSSHKeyPath
I0213 15:01:15.780430    5111 main.go:141] libmachine: (functional-634000) Calling .GetSSHUsername
I0213 15:01:15.780529    5111 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/functional-634000/id_rsa Username:docker}
I0213 15:01:15.813764    5111 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0213 15:01:15.834462    5111 main.go:141] libmachine: Making call to close driver server
I0213 15:01:15.834472    5111 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:15.834650    5111 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:15.834652    5111 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:15.834659    5111 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:15.834672    5111 main.go:141] libmachine: Making call to close driver server
I0213 15:01:15.834680    5111 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:15.834824    5111 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:15.834832    5111 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:15.834837    5111 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-634000 image ls --format json --alsologtostderr:
[{"id":"2b70e4aaac6b5370bf3a556f5e13156692351696dd5d7c5530d117aa21772748","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"42600000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-634000"],"size":"32900000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"683000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.28.4"],"size":"60100000"},{"id":"83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.28.4"],"size":"73200000"},{"id":"ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.10.1"],"size":"53600000"},{"id":"07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.28.4"],"size":"126000000"},{"id":"247f7abff9f7097bbdab57df76fedd124d1e24a6ec4944fb5ef0ad128997ce05","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"187000000"},{"id":"e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.9"],"size":"744000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"},{"id":"8df0b0e72a01edc1c930de605a91707a921a70d7d035c0ba9db49b7b7115a4f3","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-634000"],"size":"30"},{"id":"5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"501000000"},{"id":"d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.28.4"],"size":"122000000"},{"id":"73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.9-0"],"size":"294000000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"}]
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-634000 image ls --format json --alsologtostderr:
I0213 15:01:15.598243    5107 out.go:291] Setting OutFile to fd 1 ...
I0213 15:01:15.598430    5107 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:15.598435    5107 out.go:304] Setting ErrFile to fd 2...
I0213 15:01:15.598440    5107 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:15.598644    5107 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
I0213 15:01:15.599334    5107 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:15.599430    5107 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:15.599800    5107 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:15.599862    5107 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:15.607567    5107 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51305
I0213 15:01:15.607963    5107 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:15.608392    5107 main.go:141] libmachine: Using API Version  1
I0213 15:01:15.608405    5107 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:15.608635    5107 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:15.608747    5107 main.go:141] libmachine: (functional-634000) Calling .GetState
I0213 15:01:15.608865    5107 main.go:141] libmachine: (functional-634000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0213 15:01:15.608931    5107 main.go:141] libmachine: (functional-634000) DBG | hyperkit pid from json: 4213
I0213 15:01:15.610222    5107 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:15.610246    5107 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:15.618005    5107 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51307
I0213 15:01:15.618363    5107 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:15.618698    5107 main.go:141] libmachine: Using API Version  1
I0213 15:01:15.618709    5107 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:15.618947    5107 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:15.619052    5107 main.go:141] libmachine: (functional-634000) Calling .DriverName
I0213 15:01:15.619212    5107 ssh_runner.go:195] Run: systemctl --version
I0213 15:01:15.619232    5107 main.go:141] libmachine: (functional-634000) Calling .GetSSHHostname
I0213 15:01:15.619321    5107 main.go:141] libmachine: (functional-634000) Calling .GetSSHPort
I0213 15:01:15.619438    5107 main.go:141] libmachine: (functional-634000) Calling .GetSSHKeyPath
I0213 15:01:15.619523    5107 main.go:141] libmachine: (functional-634000) Calling .GetSSHUsername
I0213 15:01:15.619610    5107 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/functional-634000/id_rsa Username:docker}
I0213 15:01:15.654146    5107 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0213 15:01:15.677679    5107 main.go:141] libmachine: Making call to close driver server
I0213 15:01:15.677704    5107 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:15.677859    5107 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:15.677867    5107 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:15.677877    5107 main.go:141] libmachine: Making call to close driver server
I0213 15:01:15.677887    5107 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:15.678028    5107 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:15.678037    5107 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:15.678048    5107 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-634000 image ls --format yaml --alsologtostderr:
- id: ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.10.1
size: "53600000"
- id: e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.9
size: "744000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-634000
size: "32900000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: 5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "501000000"
- id: e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.28.4
size: "60100000"
- id: 2b70e4aaac6b5370bf3a556f5e13156692351696dd5d7c5530d117aa21772748
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "42600000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 8df0b0e72a01edc1c930de605a91707a921a70d7d035c0ba9db49b7b7115a4f3
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-634000
size: "30"
- id: 83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.28.4
size: "73200000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: 247f7abff9f7097bbdab57df76fedd124d1e24a6ec4944fb5ef0ad128997ce05
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "187000000"
- id: 73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.9-0
size: "294000000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"
- id: 7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.28.4
size: "126000000"
- id: d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.28.4
size: "122000000"

functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-634000 image ls --format yaml --alsologtostderr:
I0213 15:01:10.044833    5089 out.go:291] Setting OutFile to fd 1 ...
I0213 15:01:10.045144    5089 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:10.045150    5089 out.go:304] Setting ErrFile to fd 2...
I0213 15:01:10.045155    5089 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:10.045404    5089 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
I0213 15:01:10.046052    5089 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:10.046146    5089 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:10.046599    5089 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:10.046653    5089 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:10.055080    5089 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51285
I0213 15:01:10.055523    5089 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:10.055953    5089 main.go:141] libmachine: Using API Version  1
I0213 15:01:10.055964    5089 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:10.056213    5089 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:10.056331    5089 main.go:141] libmachine: (functional-634000) Calling .GetState
I0213 15:01:10.056427    5089 main.go:141] libmachine: (functional-634000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0213 15:01:10.056485    5089 main.go:141] libmachine: (functional-634000) DBG | hyperkit pid from json: 4213
I0213 15:01:10.057865    5089 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:10.057889    5089 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:10.065918    5089 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51287
I0213 15:01:10.066273    5089 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:10.066646    5089 main.go:141] libmachine: Using API Version  1
I0213 15:01:10.066658    5089 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:10.066876    5089 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:10.066973    5089 main.go:141] libmachine: (functional-634000) Calling .DriverName
I0213 15:01:10.067133    5089 ssh_runner.go:195] Run: systemctl --version
I0213 15:01:10.067156    5089 main.go:141] libmachine: (functional-634000) Calling .GetSSHHostname
I0213 15:01:10.067257    5089 main.go:141] libmachine: (functional-634000) Calling .GetSSHPort
I0213 15:01:10.067347    5089 main.go:141] libmachine: (functional-634000) Calling .GetSSHKeyPath
I0213 15:01:10.067426    5089 main.go:141] libmachine: (functional-634000) Calling .GetSSHUsername
I0213 15:01:10.067517    5089 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/functional-634000/id_rsa Username:docker}
I0213 15:01:10.104015    5089 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0213 15:01:10.135221    5089 main.go:141] libmachine: Making call to close driver server
I0213 15:01:10.135232    5089 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:10.135377    5089 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:10.135378    5089 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:10.135390    5089 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:10.135397    5089 main.go:141] libmachine: Making call to close driver server
I0213 15:01:10.135403    5089 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:10.135563    5089 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:10.135582    5089 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:10.135600    5089 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

TestFunctional/parallel/ImageCommands/ImageBuild (6.91s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh pgrep buildkitd: exit status 1 (142.548548ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image build -t localhost/my-image:functional-634000 testdata/build --alsologtostderr
2024/02/13 15:01:15 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image build -t localhost/my-image:functional-634000 testdata/build --alsologtostderr: (6.616544609s)
functional_test.go:319: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-634000 image build -t localhost/my-image:functional-634000 testdata/build --alsologtostderr:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in ce09ff438319
Removing intermediate container ce09ff438319
---> 2cb7e041d2a4
Step 3/3 : ADD content.txt /
---> b651cfd3f91b
Successfully built b651cfd3f91b
Successfully tagged localhost/my-image:functional-634000
functional_test.go:322: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-634000 image build -t localhost/my-image:functional-634000 testdata/build --alsologtostderr:
I0213 15:01:10.359307    5099 out.go:291] Setting OutFile to fd 1 ...
I0213 15:01:10.359684    5099 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:10.359691    5099 out.go:304] Setting ErrFile to fd 2...
I0213 15:01:10.359695    5099 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0213 15:01:10.359875    5099 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
I0213 15:01:10.360471    5099 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:10.361952    5099 config.go:182] Loaded profile config "functional-634000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I0213 15:01:10.362309    5099 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:10.362355    5099 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:10.370670    5099 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51297
I0213 15:01:10.371135    5099 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:10.371596    5099 main.go:141] libmachine: Using API Version  1
I0213 15:01:10.371607    5099 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:10.371827    5099 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:10.371946    5099 main.go:141] libmachine: (functional-634000) Calling .GetState
I0213 15:01:10.372035    5099 main.go:141] libmachine: (functional-634000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0213 15:01:10.372112    5099 main.go:141] libmachine: (functional-634000) DBG | hyperkit pid from json: 4213
I0213 15:01:10.373538    5099 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0213 15:01:10.373564    5099 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0213 15:01:10.381991    5099 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51299
I0213 15:01:10.382449    5099 main.go:141] libmachine: () Calling .GetVersion
I0213 15:01:10.382852    5099 main.go:141] libmachine: Using API Version  1
I0213 15:01:10.382869    5099 main.go:141] libmachine: () Calling .SetConfigRaw
I0213 15:01:10.383182    5099 main.go:141] libmachine: () Calling .GetMachineName
I0213 15:01:10.383356    5099 main.go:141] libmachine: (functional-634000) Calling .DriverName
I0213 15:01:10.383550    5099 ssh_runner.go:195] Run: systemctl --version
I0213 15:01:10.383574    5099 main.go:141] libmachine: (functional-634000) Calling .GetSSHHostname
I0213 15:01:10.383688    5099 main.go:141] libmachine: (functional-634000) Calling .GetSSHPort
I0213 15:01:10.383816    5099 main.go:141] libmachine: (functional-634000) Calling .GetSSHKeyPath
I0213 15:01:10.383924    5099 main.go:141] libmachine: (functional-634000) Calling .GetSSHUsername
I0213 15:01:10.384049    5099 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/functional-634000/id_rsa Username:docker}
I0213 15:01:10.436303    5099 build_images.go:151] Building image from path: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.2897731711.tar
I0213 15:01:10.436404    5099 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0213 15:01:10.447653    5099 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.2897731711.tar
I0213 15:01:10.453732    5099 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.2897731711.tar: stat -c "%s %y" /var/lib/minikube/build/build.2897731711.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.2897731711.tar': No such file or directory
I0213 15:01:10.453766    5099 ssh_runner.go:362] scp /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.2897731711.tar --> /var/lib/minikube/build/build.2897731711.tar (3072 bytes)
I0213 15:01:10.490494    5099 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.2897731711
I0213 15:01:10.499717    5099 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.2897731711 -xf /var/lib/minikube/build/build.2897731711.tar
I0213 15:01:10.508407    5099 docker.go:360] Building image: /var/lib/minikube/build/build.2897731711
I0213 15:01:10.508504    5099 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-634000 /var/lib/minikube/build/build.2897731711
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
Install the buildx component to build images with BuildKit:
https://docs.docker.com/go/buildx/

I0213 15:01:16.867927    5099 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-634000 /var/lib/minikube/build/build.2897731711: (6.359606682s)
I0213 15:01:16.867987    5099 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.2897731711
I0213 15:01:16.875228    5099 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.2897731711.tar
I0213 15:01:16.882753    5099 build_images.go:207] Built localhost/my-image:functional-634000 from /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.2897731711.tar
I0213 15:01:16.882776    5099 build_images.go:123] succeeded building to: functional-634000
I0213 15:01:16.882780    5099 build_images.go:124] failed building to: 
I0213 15:01:16.882794    5099 main.go:141] libmachine: Making call to close driver server
I0213 15:01:16.882800    5099 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:16.882932    5099 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:16.882941    5099 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
I0213 15:01:16.882946    5099 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:16.882955    5099 main.go:141] libmachine: Making call to close driver server
I0213 15:01:16.882962    5099 main.go:141] libmachine: (functional-634000) Calling .Close
I0213 15:01:16.883118    5099 main.go:141] libmachine: Successfully made call to close driver server
I0213 15:01:16.883127    5099 main.go:141] libmachine: Making call to close connection to plugin binary
I0213 15:01:16.883134    5099 main.go:141] libmachine: (functional-634000) DBG | Closing plugin on server side
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (6.91s)
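For reference, the three build steps recorded in the stdout above (FROM, RUN true, ADD content.txt) imply a Dockerfile of roughly this shape. This is a sketch reconstructed from the build log, not the verbatim contents of the repo's testdata/build directory:

```dockerfile
# Reconstructed from the build-step log above; the actual
# testdata/build Dockerfile may differ in detail.
FROM gcr.io/k8s-minikube/busybox
RUN true
ADD content.txt /
```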

TestFunctional/parallel/ImageCommands/Setup (5.92s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:341: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (5.868355113s)
functional_test.go:346: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-634000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (5.92s)

TestFunctional/parallel/DockerEnv/bash (0.71s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:495: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-634000 docker-env) && out/minikube-darwin-amd64 status -p functional-634000"
functional_test.go:518: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-634000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.71s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.94s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr: (3.777264853s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.94s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr: (2.001870022s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.16s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (8.67s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:234: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (5.356595434s)
functional_test.go:239: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-634000
functional_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr
functional_test.go:244: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image load --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr: (3.074852731s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (8.67s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.07s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image save gcr.io/google-containers/addon-resizer:functional-634000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:379: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image save gcr.io/google-containers/addon-resizer:functional-634000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.069443964s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.07s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.34s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image rm gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.34s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.40s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:408: (dbg) Done: out/minikube-darwin-amd64 -p functional-634000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.251033467s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.40s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.92s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-634000
functional_test.go:423: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 image save --daemon gcr.io/google-containers/addon-resizer:functional-634000 --alsologtostderr
functional_test.go:428: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-634000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.92s)

TestFunctional/parallel/ServiceCmd/DeployApp (16.12s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1435: (dbg) Run:  kubectl --context functional-634000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-634000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-d7447cc7f-45mdl" [26ea7170-289e-4946-966f-1d2565d5387c] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-d7447cc7f-45mdl" [26ea7170-289e-4946-966f-1d2565d5387c] Running
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 16.002982843s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (16.12s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.43s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to kill pid 4765: os: process already finished
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.43s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.14s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-634000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:344: "nginx-svc" [01836255-eb6e-43ad-b1cf-86aa3481faa3] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx-svc" [01836255-eb6e-43ad-b1cf-86aa3481faa3] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 10.003251101s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.14s)

TestFunctional/parallel/ServiceCmd/List (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1455: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.37s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1485: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service list -o json
functional_test.go:1490: Took "368.298803ms" to run "out/minikube-darwin-amd64 -p functional-634000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1505: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service --namespace=default --https --url hello-node
functional_test.go:1518: found endpoint: https://192.169.0.5:30437
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

TestFunctional/parallel/ServiceCmd/Format (0.24s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1536: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.24s)

TestFunctional/parallel/ServiceCmd/URL (0.25s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1555: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 service hello-node --url
functional_test.go:1561: found endpoint for hello-node: http://192.169.0.5:30437
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.25s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-634000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.110.205.51 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-634000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_not_create (0.38s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.38s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_list (0.28s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1311: Took "198.754288ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1325: Took "77.898315ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.28s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_json_output (0.29s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1362: Took "210.039039ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1375: Took "79.647249ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.29s)

                                                
                                    
TestFunctional/parallel/MountCmd/any-port (10.9s)
=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port2615357734/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1707865249160839000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port2615357734/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1707865249160839000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port2615357734/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1707865249160839000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port2615357734/001/test-1707865249160839000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (155.540562ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Feb 13 23:00 created-by-test
-rw-r--r-- 1 docker docker 24 Feb 13 23:00 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Feb 13 23:00 test-1707865249160839000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh cat /mount-9p/test-1707865249160839000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-634000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [454926a4-cfb3-48c6-9592-90dcc80bad99] Pending
helpers_test.go:344: "busybox-mount" [454926a4-cfb3-48c6-9592-90dcc80bad99] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [454926a4-cfb3-48c6-9592-90dcc80bad99] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [454926a4-cfb3-48c6-9592-90dcc80bad99] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 9.003037031s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-634000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port2615357734/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (10.90s)

                                                
                                    
TestFunctional/parallel/MountCmd/specific-port (1.77s)
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port683869101/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (156.262443ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port683869101/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "sudo umount -f /mount-9p": exit status 1 (126.986807ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-634000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port683869101/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.77s)

                                                
                                    
TestFunctional/parallel/MountCmd/VerifyCleanup (1.89s)
=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount1: exit status 1 (159.302877ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount1: exit status 1 (180.062985ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-634000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-634000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-634000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup4097094222/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.89s)

                                                
                                    
TestFunctional/delete_addon-resizer_images (0.22s)
=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-634000
--- PASS: TestFunctional/delete_addon-resizer_images (0.22s)

                                                
                                    
TestFunctional/delete_my-image_image (0.05s)
=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-634000
--- PASS: TestFunctional/delete_my-image_image (0.05s)

                                                
                                    
TestFunctional/delete_minikube_cached_images (0.05s)
=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-634000
--- PASS: TestFunctional/delete_minikube_cached_images (0.05s)

                                                
                                    
TestImageBuild/serial/Setup (36.87s)
=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-904000 --driver=hyperkit 
E0213 15:01:47.723047    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
image_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -p image-904000 --driver=hyperkit : (36.866785472s)
--- PASS: TestImageBuild/serial/Setup (36.87s)

                                                
                                    
TestImageBuild/serial/NormalBuild (5.38s)
=== RUN   TestImageBuild/serial/NormalBuild
image_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-904000
image_test.go:78: (dbg) Done: out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-904000: (5.378351237s)
--- PASS: TestImageBuild/serial/NormalBuild (5.38s)

                                                
                                    
TestImageBuild/serial/BuildWithBuildArg (0.72s)
=== RUN   TestImageBuild/serial/BuildWithBuildArg
image_test.go:99: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest --build-opt=build-arg=ENV_A=test_env_str --build-opt=no-cache ./testdata/image-build/test-arg -p image-904000
--- PASS: TestImageBuild/serial/BuildWithBuildArg (0.72s)

                                                
                                    
TestImageBuild/serial/BuildWithDockerIgnore (0.27s)
=== RUN   TestImageBuild/serial/BuildWithDockerIgnore
image_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal --build-opt=no-cache -p image-904000
--- PASS: TestImageBuild/serial/BuildWithDockerIgnore (0.27s)

                                                
                                    
TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.21s)
=== RUN   TestImageBuild/serial/BuildWithSpecifiedDockerfile
image_test.go:88: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest -f inner/Dockerfile ./testdata/image-build/test-f -p image-904000
--- PASS: TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.21s)

                                                
                                    
TestIngressAddonLegacy/StartLegacyK8sCluster (94.31s)
=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-620000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-620000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m34.31097093s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (94.31s)

                                                
                                    
TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (19.29s)
=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons enable ingress --alsologtostderr -v=5
E0213 15:04:03.870463    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons enable ingress --alsologtostderr -v=5: (19.288561178s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (19.29s)

                                                
                                    
TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.54s)
=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.54s)

                                                
                                    
TestIngressAddonLegacy/serial/ValidateIngressAddons (36.79s)
=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:207: (dbg) Run:  kubectl --context ingress-addon-legacy-620000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:207: (dbg) Done: kubectl --context ingress-addon-legacy-620000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (16.165259513s)
addons_test.go:232: (dbg) Run:  kubectl --context ingress-addon-legacy-620000 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:245: (dbg) Run:  kubectl --context ingress-addon-legacy-620000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:250: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [6d01ce10-7816-4d92-8a68-7cd4625069c8] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [6d01ce10-7816-4d92-8a68-7cd4625069c8] Running
E0213 15:04:31.558086    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
addons_test.go:250: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 10.00294746s
addons_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:286: (dbg) Run:  kubectl --context ingress-addon-legacy-620000 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:291: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 ip
addons_test.go:297: (dbg) Run:  nslookup hello-john.test 192.169.0.7
addons_test.go:306: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:306: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons disable ingress-dns --alsologtostderr -v=1: (2.029520449s)
addons_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons disable ingress --alsologtostderr -v=1
addons_test.go:311: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-620000 addons disable ingress --alsologtostderr -v=1: (7.337549159s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (36.79s)

                                                
                                    
TestJSONOutput/start/Command (49.74s)
=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-100000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E0213 15:04:59.892703    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:04:59.898764    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:04:59.909383    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:04:59.929698    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:04:59.969834    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:00.050364    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:00.210788    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:00.532178    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:01.173760    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:02.454123    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:05.015120    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:10.136323    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:05:20.377617    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-100000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (49.734681504s)
--- PASS: TestJSONOutput/start/Command (49.74s)

                                                
                                    
TestJSONOutput/start/Audit (0s)
=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

                                                
                                    
TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/pause/Command (0.49s)
=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-100000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.49s)

                                                
                                    
TestJSONOutput/pause/Audit (0s)
=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

                                                
                                    
TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.44s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-100000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.44s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.15s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-100000 --output=json --user=testUser
E0213 15:05:40.857289    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-100000 --output=json --user=testUser: (8.1540983s)
--- PASS: TestJSONOutput/stop/Command (8.15s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.78s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-518000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-518000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (402.815066ms)

-- stdout --
	{"specversion":"1.0","id":"b9fc1f18-bef9-46eb-829b-a37c0269deca","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-518000] minikube v1.32.0 on Darwin 14.3.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"f6ff83f4-70a3-438d-8e5c-c4f13ea966da","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=18169"}}
	{"specversion":"1.0","id":"a21c43ab-466a-4320-b0e9-7b2109d16978","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig"}}
	{"specversion":"1.0","id":"7d3c8fd2-36df-4386-9d73-d7ff426ff1c2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"8c4e0788-4c7b-4f49-a92d-9586fa9f3dd3","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"69ab6f85-2e0f-4b81-b2cf-c54337e7aa71","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube"}}
	{"specversion":"1.0","id":"e999f1a4-029a-4e8c-b140-6a2d63bb3574","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"ec776c59-e43d-48ec-8f54-b5c71ab85c5e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-518000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-518000
--- PASS: TestErrorJSONOutput (0.78s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMinikubeProfile (85.76s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-464000 --driver=hyperkit 
E0213 15:06:21.816429    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-464000 --driver=hyperkit : (37.305395028s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-465000 --driver=hyperkit 
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-465000 --driver=hyperkit : (36.96770641s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-464000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-465000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-465000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-465000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-465000: (5.317453246s)
helpers_test.go:175: Cleaning up "first-464000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-464000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-464000: (5.342891999s)
--- PASS: TestMinikubeProfile (85.76s)

TestMountStart/serial/StartWithMountFirst (16.77s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-997000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-997000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.766584166s)
--- PASS: TestMountStart/serial/StartWithMountFirst (16.77s)

TestMountStart/serial/VerifyMountFirst (0.31s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-997000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-997000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.31s)

TestMountStart/serial/StartWithMountSecond (16.79s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-010000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E0213 15:07:43.811430    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-010000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.78583007s)
--- PASS: TestMountStart/serial/StartWithMountSecond (16.79s)

TestMountStart/serial/VerifyMountSecond (0.31s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.31s)

TestMountStart/serial/DeleteFirst (2.41s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-997000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-997000 --alsologtostderr -v=5: (2.40683369s)
--- PASS: TestMountStart/serial/DeleteFirst (2.41s)

TestMountStart/serial/VerifyMountPostDelete (0.31s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.31s)

TestMountStart/serial/Stop (2.25s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-010000
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-010000: (2.24523273s)
--- PASS: TestMountStart/serial/Stop (2.25s)

TestMountStart/serial/RestartStopped (18.01s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-010000
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-010000: (17.007227164s)
--- PASS: TestMountStart/serial/RestartStopped (18.01s)

TestMountStart/serial/VerifyMountPostStop (0.29s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-010000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.29s)

TestMultiNode/serial/FreshStart2Nodes (101.03s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:86: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-464000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0213 15:09:03.939093    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:09:05.654367    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.660187    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.672243    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.693133    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.733974    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.814101    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:05.975477    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:06.295585    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:06.936412    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:08.218489    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:10.778956    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:15.899222    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:26.139296    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:09:46.620460    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
multinode_test.go:86: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-464000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m40.787043015s)
multinode_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (101.03s)

TestMultiNode/serial/DeployApp2Nodes (9.2s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:509: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:514: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- rollout status deployment/busybox
E0213 15:09:59.959974    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
multinode_test.go:514: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-464000 -- rollout status deployment/busybox: (7.36931465s)
multinode_test.go:521: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:544: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:552: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-62mlv -- nslookup kubernetes.io
multinode_test.go:552: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-tp5vk -- nslookup kubernetes.io
multinode_test.go:562: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-62mlv -- nslookup kubernetes.default
multinode_test.go:562: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-tp5vk -- nslookup kubernetes.default
multinode_test.go:570: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-62mlv -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:570: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-tp5vk -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (9.20s)

TestMultiNode/serial/PingHostFrom2Pods (0.9s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:580: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:588: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-62mlv -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:599: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-62mlv -- sh -c "ping -c 1 192.169.0.1"
multinode_test.go:588: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-tp5vk -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:599: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-464000 -- exec busybox-5b5d89c9d6-tp5vk -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.90s)

TestMultiNode/serial/AddNode (159.04s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:111: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-464000 -v 3 --alsologtostderr
E0213 15:10:27.581196    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:10:27.648314    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:11:49.500411    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
multinode_test.go:111: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-464000 -v 3 --alsologtostderr: (2m38.723704441s)
multinode_test.go:117: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (159.04s)

TestMultiNode/serial/MultiNodeLabels (0.08s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:211: (dbg) Run:  kubectl --context multinode-464000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.08s)

TestMultiNode/serial/ProfileList (0.32s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.32s)

TestMultiNode/serial/CopyFile (5.34s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp testdata/cp-test.txt multinode-464000:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile489608582/001/cp-test_multinode-464000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000:/home/docker/cp-test.txt multinode-464000-m02:/home/docker/cp-test_multinode-464000_multinode-464000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test_multinode-464000_multinode-464000-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000:/home/docker/cp-test.txt multinode-464000-m03:/home/docker/cp-test_multinode-464000_multinode-464000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test_multinode-464000_multinode-464000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp testdata/cp-test.txt multinode-464000-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m02:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile489608582/001/cp-test_multinode-464000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m02:/home/docker/cp-test.txt multinode-464000:/home/docker/cp-test_multinode-464000-m02_multinode-464000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test_multinode-464000-m02_multinode-464000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m02:/home/docker/cp-test.txt multinode-464000-m03:/home/docker/cp-test_multinode-464000-m02_multinode-464000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test_multinode-464000-m02_multinode-464000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp testdata/cp-test.txt multinode-464000-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m03:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile489608582/001/cp-test_multinode-464000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m03:/home/docker/cp-test.txt multinode-464000:/home/docker/cp-test_multinode-464000-m03_multinode-464000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000 "sudo cat /home/docker/cp-test_multinode-464000-m03_multinode-464000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 cp multinode-464000-m03:/home/docker/cp-test.txt multinode-464000-m02:/home/docker/cp-test_multinode-464000-m03_multinode-464000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 ssh -n multinode-464000-m02 "sudo cat /home/docker/cp-test_multinode-464000-m03_multinode-464000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.34s)

TestMultiNode/serial/StopNode (2.69s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:238: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 node stop m03
multinode_test.go:238: (dbg) Done: out/minikube-darwin-amd64 -p multinode-464000 node stop m03: (2.189767727s)
multinode_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status
multinode_test.go:244: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-464000 status: exit status 7 (247.354904ms)
-- stdout --
	multinode-464000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-464000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-464000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:251: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
multinode_test.go:251: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr: exit status 7 (247.336869ms)
-- stdout --
	multinode-464000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-464000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-464000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I0213 15:12:52.454272    6257 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:12:52.454481    6257 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:12:52.454487    6257 out.go:304] Setting ErrFile to fd 2...
	I0213 15:12:52.454491    6257 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:12:52.454679    6257 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:12:52.454879    6257 out.go:298] Setting JSON to false
	I0213 15:12:52.454904    6257 mustload.go:65] Loading cluster: multinode-464000
	I0213 15:12:52.454936    6257 notify.go:220] Checking for updates...
	I0213 15:12:52.455251    6257 config.go:182] Loaded profile config "multinode-464000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:12:52.455263    6257 status.go:255] checking status of multinode-464000 ...
	I0213 15:12:52.455665    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.455718    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.464430    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52276
	I0213 15:12:52.464852    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.465274    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.465290    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.465497    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.465599    6257 main.go:141] libmachine: (multinode-464000) Calling .GetState
	I0213 15:12:52.465686    6257 main.go:141] libmachine: (multinode-464000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:12:52.465748    6257 main.go:141] libmachine: (multinode-464000) DBG | hyperkit pid from json: 5801
	I0213 15:12:52.466974    6257 status.go:330] multinode-464000 host status = "Running" (err=<nil>)
	I0213 15:12:52.466991    6257 host.go:66] Checking if "multinode-464000" exists ...
	I0213 15:12:52.467216    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.467241    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.475243    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52278
	I0213 15:12:52.475586    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.475906    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.475917    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.476104    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.476203    6257 main.go:141] libmachine: (multinode-464000) Calling .GetIP
	I0213 15:12:52.476292    6257 host.go:66] Checking if "multinode-464000" exists ...
	I0213 15:12:52.476532    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.476552    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.484529    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52280
	I0213 15:12:52.484879    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.485218    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.485229    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.485436    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.485536    6257 main.go:141] libmachine: (multinode-464000) Calling .DriverName
	I0213 15:12:52.485679    6257 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0213 15:12:52.485704    6257 main.go:141] libmachine: (multinode-464000) Calling .GetSSHHostname
	I0213 15:12:52.485785    6257 main.go:141] libmachine: (multinode-464000) Calling .GetSSHPort
	I0213 15:12:52.485866    6257 main.go:141] libmachine: (multinode-464000) Calling .GetSSHKeyPath
	I0213 15:12:52.485953    6257 main.go:141] libmachine: (multinode-464000) Calling .GetSSHUsername
	I0213 15:12:52.486046    6257 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/multinode-464000/id_rsa Username:docker}
	I0213 15:12:52.528366    6257 ssh_runner.go:195] Run: systemctl --version
	I0213 15:12:52.532013    6257 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0213 15:12:52.541076    6257 kubeconfig.go:92] found "multinode-464000" server: "https://192.169.0.13:8443"
	I0213 15:12:52.541097    6257 api_server.go:166] Checking apiserver status ...
	I0213 15:12:52.541136    6257 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0213 15:12:52.550022    6257 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1879/cgroup
	I0213 15:12:52.556378    6257 api_server.go:182] apiserver freezer: "9:freezer:/kubepods/burstable/podf1230d4b67bfe1a2d63bac63d89caabc/8d73e8a0493f74bf425470b07b717057891115d683005eb17d563d9023686740"
	I0213 15:12:52.556422    6257 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/podf1230d4b67bfe1a2d63bac63d89caabc/8d73e8a0493f74bf425470b07b717057891115d683005eb17d563d9023686740/freezer.state
	I0213 15:12:52.562735    6257 api_server.go:204] freezer state: "THAWED"
	I0213 15:12:52.562747    6257 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0213 15:12:52.566182    6257 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0213 15:12:52.566195    6257 status.go:421] multinode-464000 apiserver status = Running (err=<nil>)
	I0213 15:12:52.566209    6257 status.go:257] multinode-464000 status: &{Name:multinode-464000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0213 15:12:52.566226    6257 status.go:255] checking status of multinode-464000-m02 ...
	I0213 15:12:52.566467    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.566490    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.574531    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52284
	I0213 15:12:52.574883    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.575231    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.575245    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.575477    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.575580    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetState
	I0213 15:12:52.575667    6257 main.go:141] libmachine: (multinode-464000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:12:52.575747    6257 main.go:141] libmachine: (multinode-464000-m02) DBG | hyperkit pid from json: 5850
	I0213 15:12:52.576996    6257 status.go:330] multinode-464000-m02 host status = "Running" (err=<nil>)
	I0213 15:12:52.577005    6257 host.go:66] Checking if "multinode-464000-m02" exists ...
	I0213 15:12:52.577243    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.577272    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.585213    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52286
	I0213 15:12:52.585591    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.585899    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.585909    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.586106    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.586202    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetIP
	I0213 15:12:52.586293    6257 host.go:66] Checking if "multinode-464000-m02" exists ...
	I0213 15:12:52.586528    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.586559    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.594389    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52288
	I0213 15:12:52.594741    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.595110    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.595132    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.595344    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.595443    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .DriverName
	I0213 15:12:52.595561    6257 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0213 15:12:52.595572    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetSSHHostname
	I0213 15:12:52.595657    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetSSHPort
	I0213 15:12:52.595742    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetSSHKeyPath
	I0213 15:12:52.595825    6257 main.go:141] libmachine: (multinode-464000-m02) Calling .GetSSHUsername
	I0213 15:12:52.595900    6257 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18169-2790/.minikube/machines/multinode-464000-m02/id_rsa Username:docker}
	I0213 15:12:52.626950    6257 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0213 15:12:52.635659    6257 status.go:257] multinode-464000-m02 status: &{Name:multinode-464000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0213 15:12:52.635676    6257 status.go:255] checking status of multinode-464000-m03 ...
	I0213 15:12:52.635937    6257 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:12:52.635960    6257 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:12:52.643864    6257 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52291
	I0213 15:12:52.644234    6257 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:12:52.644563    6257 main.go:141] libmachine: Using API Version  1
	I0213 15:12:52.644582    6257 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:12:52.644799    6257 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:12:52.644914    6257 main.go:141] libmachine: (multinode-464000-m03) Calling .GetState
	I0213 15:12:52.645004    6257 main.go:141] libmachine: (multinode-464000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:12:52.645070    6257 main.go:141] libmachine: (multinode-464000-m03) DBG | hyperkit pid from json: 5945
	I0213 15:12:52.646250    6257 main.go:141] libmachine: (multinode-464000-m03) DBG | hyperkit pid 5945 missing from process table
	I0213 15:12:52.646280    6257 status.go:330] multinode-464000-m03 host status = "Stopped" (err=<nil>)
	I0213 15:12:52.646286    6257 status.go:343] host is not running, skipping remaining checks
	I0213 15:12:52.646291    6257 status.go:257] multinode-464000-m03 status: &{Name:multinode-464000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.69s)
TestMultiNode/serial/StartAfterStop (27.03s)
=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 node start m03 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p multinode-464000 node start m03 --alsologtostderr: (26.667171486s)
multinode_test.go:289: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status
multinode_test.go:303: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (27.03s)
TestMultiNode/serial/RestartKeepsNodes (127.08s)
=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-464000
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-464000
multinode_test.go:318: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-464000: (18.376555822s)
multinode_test.go:323: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-464000 --wait=true -v=8 --alsologtostderr
E0213 15:14:03.933198    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:14:05.649553    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:14:33.337391    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:14:59.955293    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
multinode_test.go:323: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-464000 --wait=true -v=8 --alsologtostderr: (1m48.590641887s)
multinode_test.go:328: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-464000
--- PASS: TestMultiNode/serial/RestartKeepsNodes (127.08s)
TestMultiNode/serial/DeleteNode (3.07s)
=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:422: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 node delete m03
E0213 15:15:26.979446    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
multinode_test.go:422: (dbg) Done: out/minikube-darwin-amd64 -p multinode-464000 node delete m03: (2.73027219s)
multinode_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
multinode_test.go:452: (dbg) Run:  kubectl get nodes
multinode_test.go:460: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (3.07s)
TestMultiNode/serial/StopMultiNode (16.49s)
=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:342: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 stop
multinode_test.go:342: (dbg) Done: out/minikube-darwin-amd64 -p multinode-464000 stop: (16.334784749s)
multinode_test.go:348: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status
multinode_test.go:348: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-464000 status: exit status 7 (77.438999ms)
-- stdout --
	multinode-464000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-464000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:355: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
multinode_test.go:355: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr: exit status 7 (76.885028ms)
-- stdout --
	multinode-464000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-464000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I0213 15:15:46.293028    6452 out.go:291] Setting OutFile to fd 1 ...
	I0213 15:15:46.293208    6452 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:15:46.293213    6452 out.go:304] Setting ErrFile to fd 2...
	I0213 15:15:46.293218    6452 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0213 15:15:46.293406    6452 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18169-2790/.minikube/bin
	I0213 15:15:46.293589    6452 out.go:298] Setting JSON to false
	I0213 15:15:46.293612    6452 mustload.go:65] Loading cluster: multinode-464000
	I0213 15:15:46.293656    6452 notify.go:220] Checking for updates...
	I0213 15:15:46.293923    6452 config.go:182] Loaded profile config "multinode-464000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I0213 15:15:46.293935    6452 status.go:255] checking status of multinode-464000 ...
	I0213 15:15:46.294340    6452 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:15:46.294373    6452 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:15:46.302581    6452 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52473
	I0213 15:15:46.302944    6452 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:15:46.303370    6452 main.go:141] libmachine: Using API Version  1
	I0213 15:15:46.303379    6452 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:15:46.303574    6452 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:15:46.303680    6452 main.go:141] libmachine: (multinode-464000) Calling .GetState
	I0213 15:15:46.303757    6452 main.go:141] libmachine: (multinode-464000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:15:46.303825    6452 main.go:141] libmachine: (multinode-464000) DBG | hyperkit pid from json: 6336
	I0213 15:15:46.304766    6452 main.go:141] libmachine: (multinode-464000) DBG | hyperkit pid 6336 missing from process table
	I0213 15:15:46.304813    6452 status.go:330] multinode-464000 host status = "Stopped" (err=<nil>)
	I0213 15:15:46.304825    6452 status.go:343] host is not running, skipping remaining checks
	I0213 15:15:46.304830    6452 status.go:257] multinode-464000 status: &{Name:multinode-464000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0213 15:15:46.304847    6452 status.go:255] checking status of multinode-464000-m02 ...
	I0213 15:15:46.305091    6452 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0213 15:15:46.305113    6452 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0213 15:15:46.312817    6452 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52475
	I0213 15:15:46.313154    6452 main.go:141] libmachine: () Calling .GetVersion
	I0213 15:15:46.313466    6452 main.go:141] libmachine: Using API Version  1
	I0213 15:15:46.313484    6452 main.go:141] libmachine: () Calling .SetConfigRaw
	I0213 15:15:46.313703    6452 main.go:141] libmachine: () Calling .GetMachineName
	I0213 15:15:46.313807    6452 main.go:141] libmachine: (multinode-464000-m02) Calling .GetState
	I0213 15:15:46.313891    6452 main.go:141] libmachine: (multinode-464000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0213 15:15:46.313952    6452 main.go:141] libmachine: (multinode-464000-m02) DBG | hyperkit pid from json: 6370
	I0213 15:15:46.314885    6452 main.go:141] libmachine: (multinode-464000-m02) DBG | hyperkit pid 6370 missing from process table
	I0213 15:15:46.314926    6452 status.go:330] multinode-464000-m02 host status = "Stopped" (err=<nil>)
	I0213 15:15:46.314937    6452 status.go:343] host is not running, skipping remaining checks
	I0213 15:15:46.314943    6452 status.go:257] multinode-464000-m02 status: &{Name:multinode-464000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (16.49s)
TestMultiNode/serial/RestartMultiNode (80.55s)
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-464000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
multinode_test.go:382: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-464000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (1m20.218602535s)
multinode_test.go:388: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-464000 status --alsologtostderr
multinode_test.go:402: (dbg) Run:  kubectl get nodes
multinode_test.go:410: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (80.55s)
TestMultiNode/serial/ValidateNameConflict (40.52s)
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:471: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-464000
multinode_test.go:480: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-464000-m02 --driver=hyperkit 
multinode_test.go:480: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-464000-m02 --driver=hyperkit : exit status 14 (479.313477ms)
-- stdout --
	* [multinode-464000-m02] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	! Profile name 'multinode-464000-m02' is duplicated with machine name 'multinode-464000-m02' in profile 'multinode-464000'
	X Exiting due to MK_USAGE: Profile name should be unique
** /stderr **
multinode_test.go:488: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-464000-m03 --driver=hyperkit 
multinode_test.go:488: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-464000-m03 --driver=hyperkit : (36.287814359s)
multinode_test.go:495: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-464000
multinode_test.go:495: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-464000: exit status 80 (279.260619ms)
-- stdout --
	* Adding node m03 to cluster multinode-464000
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-464000-m03 already exists in multinode-464000-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
multinode_test.go:500: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-464000-m03
multinode_test.go:500: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-464000-m03: (3.420647812s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (40.52s)
TestPreload (180.06s)
=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-687000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
E0213 15:19:03.925840    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:19:05.641979    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-687000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m33.826343869s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-687000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-687000 image pull gcr.io/k8s-minikube/busybox: (5.794632225s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-687000
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-687000: (8.241734882s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-687000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
E0213 15:19:59.948221    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-687000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (1m6.777963964s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-687000 image list
helpers_test.go:175: Cleaning up "test-preload-687000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-687000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-687000: (5.268505437s)
--- PASS: TestPreload (180.06s)
TestScheduledStopUnix (106s)
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-290000 --memory=2048 --driver=hyperkit 
E0213 15:21:22.995171    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-290000 --memory=2048 --driver=hyperkit : (34.481223648s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-290000 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-290000 -n scheduled-stop-290000
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-290000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-290000 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-290000 -n scheduled-stop-290000
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-290000
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-290000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-290000
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-290000: exit status 7 (67.047474ms)
-- stdout --
	scheduled-stop-290000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-290000 -n scheduled-stop-290000
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-290000 -n scheduled-stop-290000: exit status 7 (67.081783ms)
-- stdout --
	Stopped
-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-290000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-290000
--- PASS: TestScheduledStopUnix (106.00s)
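The scheduled-stop sequence exercised above can be reproduced by hand. A minimal sketch only: the profile name `demo` and the `DRY_RUN` guard are hypothetical, and by default the script echoes each command instead of running it, so no hyperkit host is needed.

```shell
#!/bin/sh
# Sketch of the TestScheduledStopUnix flow. DRY_RUN=1 (the default) prints
# each command with a "+ " prefix instead of executing it.
PROFILE=demo                       # hypothetical profile name
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run minikube start -p "$PROFILE" --memory=2048 --driver=hyperkit
run minikube stop  -p "$PROFILE" --schedule 5m        # arm a stop 5 minutes out
run minikube stop  -p "$PROFILE" --cancel-scheduled   # disarm it again
run minikube stop  -p "$PROFILE" --schedule 15s       # re-arm with a short fuse
# Once the scheduled stop fires, status prints "Stopped" and exits 7
# ("may be ok", as the test notes above show):
run minikube status -p "$PROFILE" --format='{{.Host}}'
```

Set `DRY_RUN=0` to actually execute the sequence against a real hyperkit host.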

TestSkaffold (128.04s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe924372793 version
skaffold_test.go:59: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe924372793 version: (1.695670188s)
skaffold_test.go:63: skaffold version: v2.10.0
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-220000 --memory=2600 --driver=hyperkit 
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-220000 --memory=2600 --driver=hyperkit : (34.611116791s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe924372793 run --minikube-profile skaffold-220000 --kube-context skaffold-220000 --status-check=true --port-forward=false --interactive=false
E0213 15:24:03.918964    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:24:05.636209    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
skaffold_test.go:105: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe924372793 run --minikube-profile skaffold-220000 --kube-context skaffold-220000 --status-check=true --port-forward=false --interactive=false: (1m8.473659451s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:344: "leeroy-app-77fdbbbd66-sjr5x" [babd9835-4174-47ad-9751-4d9279051b25] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 6.003939692s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:344: "leeroy-web-5456fdd756-4d6ds" [0d91e9bb-d2bf-455e-8daa-2704ce1c79bd] Running
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.004064163s
helpers_test.go:175: Cleaning up "skaffold-220000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-220000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-220000: (5.263121502s)
--- PASS: TestSkaffold (128.04s)
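The skaffold invocation above maps onto a short standalone sequence: bring up a profile, point skaffold at it, then check the deployed pods. A hedged sketch; the profile name `demo` is hypothetical, and commands are echoed rather than executed so the snippet is safe without a cluster or a skaffold binary.

```shell
#!/bin/sh
# Sketch of the TestSkaffold flow. Echo-only by default (DRY_RUN=1).
PROFILE=demo
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run minikube start -p "$PROFILE" --memory=2600 --driver=hyperkit
# Build and deploy against the minikube profile and its kube-context;
# wait for deployments to stabilize, no port-forwarding, no prompts.
run skaffold run --minikube-profile "$PROFILE" --kube-context "$PROFILE" \
  --status-check=true --port-forward=false --interactive=false
# The test then waits for pods labelled app=leeroy-app / app=leeroy-web:
run kubectl --context "$PROFILE" get pods -l app=leeroy-app
```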

TestRunningBinaryUpgrade (113.44s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.2707078458 start -p running-upgrade-427000 --memory=2200 --vm-driver=hyperkit 
E0213 15:29:03.897754    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:29:05.614928    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:29:35.843294    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:35.849033    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:35.859207    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:35.879488    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:35.920297    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:36.000703    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:36.162336    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:36.482836    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:37.123958    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:38.404717    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:40.964825    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:46.085328    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:29:56.325206    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
version_upgrade_test.go:120: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.2707078458 start -p running-upgrade-427000 --memory=2200 --vm-driver=hyperkit : (1m17.865558852s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-427000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0213 15:29:59.919372    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:30:16.804813    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
version_upgrade_test.go:130: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-427000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (24.891126159s)
helpers_test.go:175: Cleaning up "running-upgrade-427000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-427000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-427000: (5.314896386s)
--- PASS: TestRunningBinaryUpgrade (113.44s)
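TestRunningBinaryUpgrade boils down to creating a cluster with an old minikube release, then re-running `start` with the current binary against the same still-running profile. A sketch under stated assumptions: `./minikube-v1.26.0`, `./minikube`, and the profile `demo` stand in for the temp-dir binaries and profile the test uses, and commands are echoed, not executed.

```shell
#!/bin/sh
# Sketch of an in-place binary upgrade: the old release creates the cluster,
# the new binary adopts it without a stop in between. Echo-only by default.
PROFILE=demo
OLD=./minikube-v1.26.0   # hypothetical path to the previous release
NEW=./minikube           # current binary under test
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Note the old release still takes the legacy --vm-driver flag, as in the
# log above, while the current binary uses --driver.
run "$OLD" start -p "$PROFILE" --memory=2200 --vm-driver=hyperkit
run "$NEW" start -p "$PROFILE" --memory=2200 --alsologtostderr -v=1 --driver=hyperkit
run "$NEW" delete -p "$PROFILE"
```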

TestKubernetesUpgrade (163.09s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:222: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m13.911484598s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-528000
version_upgrade_test.go:227: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-528000: (8.223582553s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-528000 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-528000 status --format={{.Host}}: exit status 7 (69.167851ms)
-- stdout --
	Stopped
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:243: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit : (32.687351878s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-528000 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (446.57844ms)
-- stdout --
	* [kubernetes-upgrade-528000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.29.0-rc.2 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-528000
	    minikube start -p kubernetes-upgrade-528000 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-5280002 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.29.0-rc.2, by running:
	    
	    minikube start -p kubernetes-upgrade-528000 --kubernetes-version=v1.29.0-rc.2
	    
** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:275: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-528000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit : (42.421628007s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-528000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-528000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-528000: (5.281750632s)
--- PASS: TestKubernetesUpgrade (163.09s)
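The contract this test checks: a stopped cluster may be upgraded to a newer Kubernetes, but a downgrade attempt aborts with exit status 106 (`K8S_DOWNGRADE_UNSUPPORTED`) and suggests `minikube delete` plus a fresh start instead. A sketch of the same sequence; the profile `demo` is hypothetical and the commands are echoed, not executed.

```shell
#!/bin/sh
# Sketch of the TestKubernetesUpgrade sequence. Echo-only by default.
PROFILE=demo
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run minikube start -p "$PROFILE" --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit
run minikube stop  -p "$PROFILE"
# Upgrading the stopped cluster to a newer Kubernetes is supported:
run minikube start -p "$PROFILE" --memory=2200 --kubernetes-version=v1.29.0-rc.2 --driver=hyperkit
# Downgrading is not: in a real run this exits 106 (K8S_DOWNGRADE_UNSUPPORTED):
run minikube start -p "$PROFILE" --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit
```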

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.37s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
E0213 15:24:59.941757    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
* minikube v1.32.0 on darwin
- MINIKUBE_LOCATION=18169
- KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1708441091/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1708441091/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1708441091/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1708441091/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.37s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (7.39s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.32.0 on darwin
- MINIKUBE_LOCATION=18169
- KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1997672193/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1997672193/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1997672193/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1997672193/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (7.39s)

TestStoppedBinaryUpgrade/Setup (5s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (5.00s)

TestStoppedBinaryUpgrade/Upgrade (89.06s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.4283497380 start -p stopped-upgrade-531000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:183: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.4283497380 start -p stopped-upgrade-531000 --memory=2200 --vm-driver=hyperkit : (43.368464492s)
version_upgrade_test.go:192: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.4283497380 -p stopped-upgrade-531000 stop
version_upgrade_test.go:192: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.4283497380 -p stopped-upgrade-531000 stop: (8.225353076s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-531000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0213 15:32:06.941960    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:32:19.681145    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
version_upgrade_test.go:198: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-531000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (37.463259989s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (89.06s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.99s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-531000
version_upgrade_test.go:206: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-531000: (2.988512147s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.99s)

TestPause/serial/Start (57.5s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-645000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-645000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (57.502392045s)
--- PASS: TestPause/serial/Start (57.50s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.5s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (502.551266ms)
-- stdout --
	* [NoKubernetes-777000] minikube v1.32.0 on Darwin 14.3.1
	  - MINIKUBE_LOCATION=18169
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18169-2790/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18169-2790/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.50s)

TestNoKubernetes/serial/StartWithK8s (38.33s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-777000 --driver=hyperkit 
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-777000 --driver=hyperkit : (38.153949314s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-777000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (38.33s)

TestPause/serial/SecondStartNoReconfiguration (38.9s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-645000 --alsologtostderr -v=1 --driver=hyperkit 
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-645000 --alsologtostderr -v=1 --driver=hyperkit : (38.889463992s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (38.90s)

TestNoKubernetes/serial/StartWithStopK8s (7.65s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --driver=hyperkit : (5.088026041s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-777000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-777000 status -o json: exit status 2 (144.978758ms)
-- stdout --
	{"Name":"NoKubernetes-777000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}
-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-777000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-777000: (2.418444243s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (7.65s)
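The `--no-kubernetes` checks above combine into a short lifecycle: the flag conflicts with `--kubernetes-version` (exit status 14, `MK_USAGE`), and a `--no-kubernetes` start over an existing Kubernetes profile leaves the host `Running` with kubelet and apiserver `Stopped`. A sketch with a hypothetical profile `demo`; commands are echoed rather than run.

```shell
#!/bin/sh
# Sketch of the --no-kubernetes lifecycle from the tests above. Echo-only.
PROFILE=demo
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# Invalid: --kubernetes-version cannot be combined with --no-kubernetes
# (a real run exits 14 with MK_USAGE, as logged above).
run minikube start -p "$PROFILE" --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit
# Valid: plain --no-kubernetes start; status -o json then reports
# "Host":"Running" with "Kubelet":"Stopped" and exits 2.
run minikube start  -p "$PROFILE" --no-kubernetes --driver=hyperkit
run minikube status -p "$PROFILE" -o json
# The guest really has no kubelet unit running (ssh exits 1):
run minikube ssh -p "$PROFILE" "sudo systemctl is-active --quiet service kubelet"
```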

TestNoKubernetes/serial/Start (16.26s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --driver=hyperkit 
E0213 15:34:03.887123    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:34:05.603800    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-777000 --no-kubernetes --driver=hyperkit : (16.258611074s)
--- PASS: TestNoKubernetes/serial/Start (16.26s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.14s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-777000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-777000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (138.948296ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.14s)

TestNoKubernetes/serial/ProfileList (0.53s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.53s)

TestNoKubernetes/serial/Stop (8.26s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-777000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-777000: (8.255082274s)
--- PASS: TestNoKubernetes/serial/Stop (8.26s)

TestNoKubernetes/serial/StartNoArgs (15.35s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-777000 --driver=hyperkit 
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-777000 --driver=hyperkit : (15.353580056s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (15.35s)

TestPause/serial/Pause (0.5s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-645000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.50s)

TestPause/serial/VerifyStatus (0.16s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 status -p pause-645000 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p pause-645000 --output=json --layout=cluster: exit status 2 (161.438105ms)
-- stdout --
	{"Name":"pause-645000","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 12 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.32.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-645000","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}
-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.16s)
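The TestPause serial steps walk one pause lifecycle end to end: pause, inspect status, unpause, pause again, then delete while paused. A sketch of the same sequence; the profile `demo` is hypothetical, and by default the commands are echoed instead of executed.

```shell
#!/bin/sh
# Sketch of the pause lifecycle checked by the TestPause serial steps.
# Echo-only by default (DRY_RUN=1).
PROFILE=demo
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run minikube pause   -p "$PROFILE" --alsologtostderr -v=5
# While paused, cluster-layout status exits 2 and reports StatusCode 418
# ("Paused") for the node and apiserver, as the JSON above shows:
run minikube status  -p "$PROFILE" --output=json --layout=cluster
run minikube unpause -p "$PROFILE" --alsologtostderr -v=5
run minikube pause   -p "$PROFILE" --alsologtostderr -v=5  # pausing again works
run minikube delete  -p "$PROFILE" --alsologtostderr -v=5  # deleting a paused VM works
```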

TestPause/serial/Unpause (0.5s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 unpause -p pause-645000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.50s)

TestPause/serial/PauseAgain (0.56s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-645000 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.56s)

TestPause/serial/DeletePaused (5.27s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p pause-645000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p pause-645000 --alsologtostderr -v=5: (5.265749124s)
--- PASS: TestPause/serial/DeletePaused (5.27s)

                                                
                                    
TestPause/serial/VerifyDeletedResources (0.18s)

                                                
                                                
=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.18s)

                                                
                                    
TestNetworkPlugins/group/auto/Start (50.4s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit 
E0213 15:34:35.833045    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p auto-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit : (50.399047366s)
--- PASS: TestNetworkPlugins/group/auto/Start (50.40s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.13s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-777000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-777000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (128.659018ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.13s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Start (65.49s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit 
E0213 15:34:59.909418    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:35:03.515766    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit : (1m5.487341593s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (65.49s)

                                                
                                    
TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/auto/NetCatPod (14.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-qxf4w" [e971308c-bd8c-4557-bf8b-be3b0cbc2334] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-qxf4w" [e971308c-bd8c-4557-bf8b-be3b0cbc2334] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 14.003645995s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (14.15s)

                                                
                                    
TestNetworkPlugins/group/auto/DNS (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.16s)

                                                
                                    
TestNetworkPlugins/group/auto/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/auto/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/kindnet/ControllerPod (6s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-ml8k8" [d53be2ed-b6a9-44eb-8cb9-6369819e380b] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.003611316s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.00s)

                                                
                                    
TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)

                                                
                                    
TestNetworkPlugins/group/kindnet/NetCatPod (15.2s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-2mw7g" [16b4aa54-d35b-4513-8c52-c39d51971e19] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-2mw7g" [16b4aa54-d35b-4513-8c52-c39d51971e19] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 15.002639581s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (15.20s)

                                                
                                    
TestNetworkPlugins/group/calico/Start (78.75s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p calico-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit : (1m18.753739775s)
--- PASS: TestNetworkPlugins/group/calico/Start (78.75s)

                                                
                                    
TestNetworkPlugins/group/kindnet/DNS (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.14s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/kindnet/HairPin (0.1s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/Start (58.78s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (58.775983212s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (58.78s)

                                                
                                    
TestNetworkPlugins/group/calico/ControllerPod (6s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-rfbq9" [dff5eee8-6cc9-4a37-aef2-60ca91068a41] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.003416493s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.00s)

                                                
                                    
TestNetworkPlugins/group/calico/KubeletFlags (0.17s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.17s)

                                                
                                    
TestNetworkPlugins/group/calico/NetCatPod (16.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-lvw5n" [2d3e01e7-1053-4c69-9c7d-15f261e5b271] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-lvw5n" [2d3e01e7-1053-4c69-9c7d-15f261e5b271] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 16.004289643s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (16.14s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/NetCatPod (16.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-jf54l" [9edf432a-0f61-4470-a35d-9fe70f725abf] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-jf54l" [9edf432a-0f61-4470-a35d-9fe70f725abf] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 16.002678941s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (16.15s)

                                                
                                    
TestNetworkPlugins/group/calico/DNS (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/calico/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/calico/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/Localhost (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.13s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/HairPin (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.12s)

                                                
                                    
TestNetworkPlugins/group/false/Start (52.84s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p false-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p false-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit : (52.844571749s)
--- PASS: TestNetworkPlugins/group/false/Start (52.84s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Start (61.33s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit 
E0213 15:38:02.953071    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit : (1m1.32879947s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (61.33s)

                                                
                                    
TestNetworkPlugins/group/false/KubeletFlags (0.17s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.17s)

                                                
                                    
TestNetworkPlugins/group/false/NetCatPod (15.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context false-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-ffl67" [3869a368-008a-45c0-b0f2-e5599cb67fab] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-ffl67" [3869a368-008a-45c0-b0f2-e5599cb67fab] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 15.004369066s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (15.15s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.16s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/NetCatPod (15.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-74858" [2d470656-0e3b-4ecf-9ec4-6a1713f6f262] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0213 15:39:03.877236    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:39:05.594090    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-74858" [2d470656-0e3b-4ecf-9ec4-6a1713f6f262] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 15.004469551s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (15.15s)

                                                
                                    
TestNetworkPlugins/group/false/DNS (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:175: (dbg) Run:  kubectl --context false-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.12s)

                                                
                                    
TestNetworkPlugins/group/false/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:194: (dbg) Run:  kubectl --context false-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/false/HairPin (0.1s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:264: (dbg) Run:  kubectl --context false-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/false/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/flannel/Start (61.35s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit : (1m1.352048757s)
--- PASS: TestNetworkPlugins/group/flannel/Start (61.35s)

                                                
                                    
TestNetworkPlugins/group/bridge/Start (93.03s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit 
E0213 15:39:35.824457    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:39:59.899590    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit : (1m33.034790527s)
--- PASS: TestNetworkPlugins/group/bridge/Start (93.03s)

                                                
                                    
TestNetworkPlugins/group/flannel/ControllerPod (6s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:344: "kube-flannel-ds-k57ng" [61b78763-3989-4b77-8931-f3e086d2678e] Running
E0213 15:40:26.371839    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.376989    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.387359    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.408923    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.449409    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.530756    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:26.691926    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:27.013626    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:27.655148    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:28.936242    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.002373377s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.00s)

                                                
                                    
TestNetworkPlugins/group/flannel/KubeletFlags (0.17s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.17s)

                                                
                                    
TestNetworkPlugins/group/flannel/NetCatPod (15.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-mp9fb" [f6e4de84-96bf-4544-b2f8-fcf3a210fd21] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0213 15:40:31.498436    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:40:36.619113    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-mp9fb" [f6e4de84-96bf-4544-b2f8-fcf3a210fd21] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 15.002955435s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (15.15s)

TestNetworkPlugins/group/flannel/DNS (0.13s)

=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.13s)

TestNetworkPlugins/group/flannel/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.11s)

TestNetworkPlugins/group/flannel/HairPin (0.12s)

=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0213 15:40:46.670598    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:40:46.676572    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:40:46.687121    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:40:46.707440    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.12s)

TestNetworkPlugins/group/kubenet/Start (59.26s)

=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit 
E0213 15:41:07.156532    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:41:07.339077    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-599000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit : (59.255028514s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (59.26s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)

TestNetworkPlugins/group/bridge/NetCatPod (15.15s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-2mqvw" [8af389b1-4a11-4140-b664-256883753ef7] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-2mqvw" [8af389b1-4a11-4140-b664-256883753ef7] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 15.004167321s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (15.15s)

TestNetworkPlugins/group/bridge/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.12s)

TestNetworkPlugins/group/bridge/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.11s)

TestNetworkPlugins/group/bridge/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.11s)

TestStartStop/group/old-k8s-version/serial/FirstStart (132.6s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-481000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0213 15:41:48.298291    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-481000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (2m12.595741016s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (132.60s)

TestNetworkPlugins/group/kubenet/KubeletFlags (0.17s)

=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-599000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.17s)

TestNetworkPlugins/group/kubenet/NetCatPod (15.17s)

=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kubenet-599000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-6hmwf" [38d1204f-211d-4eb1-a541-ec512ba49b21] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0213 15:42:08.595844    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:42:08.641242    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-6hmwf" [38d1204f-211d-4eb1-a541-ec512ba49b21] Running
E0213 15:42:16.781751    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:16.787324    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:16.797842    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:16.819431    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:16.859751    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:16.941612    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:17.101938    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:17.423157    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:18.064139    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:42:19.345733    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 15.00586836s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (15.17s)

TestNetworkPlugins/group/kubenet/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kubenet-599000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.15s)

TestNetworkPlugins/group/kubenet/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kubenet-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)

TestNetworkPlugins/group/kubenet/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kubenet-599000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kubenet/HairPin (0.11s)
E0213 16:25:00.075343    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 16:25:25.251805    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 16:25:26.550573    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 16:25:46.849974    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 16:26:07.890576    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 16:26:42.371260    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/no-preload-355000/client.crt: no such file or directory
E0213 16:26:56.960762    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/old-k8s-version-481000/client.crt: no such file or directory
E0213 16:27:04.613396    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 16:27:17.007042    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 16:27:25.228175    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 16:28:03.177218    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 16:28:16.590849    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.596278    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.608475    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.628794    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.671034    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.751759    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:16.912497    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory
E0213 16:28:17.233021    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/default-k8s-diff-port-603000/client.crt: no such file or directory

TestStartStop/group/no-preload/serial/FirstStart (61.64s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-355000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E0213 15:42:45.484621    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:42:57.750882    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:43:05.964143    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:43:10.215997    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:43:30.513872    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:43:38.710052    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-355000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: (1m1.640089225s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (61.64s)

TestStartStop/group/no-preload/serial/DeployApp (13.23s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-355000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [ee0059a4-3d09-47d3-a184-8eda1b3cb550] Pending
helpers_test.go:344: "busybox" [ee0059a4-3d09-47d3-a184-8eda1b3cb550] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0213 15:43:46.923886    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
helpers_test.go:344: "busybox" [ee0059a4-3d09-47d3-a184-8eda1b3cb550] Running
E0213 15:43:50.725890    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:50.731020    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:50.741705    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:50.762084    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:50.803347    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:50.883727    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:51.043871    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:51.364828    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:43:52.004932    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 13.003999928s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-355000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (13.23s)

TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.76s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-355000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-355000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.76s)

TestStartStop/group/no-preload/serial/Stop (8.27s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-355000 --alsologtostderr -v=3
E0213 15:43:53.285347    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-355000 --alsologtostderr -v=3: (8.271615835s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (8.27s)

TestStartStop/group/old-k8s-version/serial/DeployApp (13.32s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-481000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [a2dc9bd0-8b47-48da-9a7e-6cf08fb4b9a1] Pending
helpers_test.go:344: "busybox" [a2dc9bd0-8b47-48da-9a7e-6cf08fb4b9a1] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0213 15:43:55.846191    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
helpers_test.go:344: "busybox" [a2dc9bd0-8b47-48da-9a7e-6cf08fb4b9a1] Running
E0213 15:44:00.966687    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 13.002035726s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-481000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (13.32s)

TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.34s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-355000 -n no-preload-355000
E0213 15:44:01.288064    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:01.294426    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:01.304796    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-355000 -n no-preload-355000: exit status 7 (68.168222ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-355000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
E0213 15:44:01.325345    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:01.367000    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:01.447124    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.34s)

TestStartStop/group/no-preload/serial/SecondStart (296.86s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-355000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E0213 15:44:01.608454    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:01.929248    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:02.570035    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:03.850132    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:03.867023    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:44:05.583381    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
E0213 15:44:06.411483    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-355000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: (4m56.693335255s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-355000 -n no-preload-355000
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (296.86s)

TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.69s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-481000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-481000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.69s)

TestStartStop/group/old-k8s-version/serial/Stop (8.25s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-481000 --alsologtostderr -v=3
E0213 15:44:11.206480    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:44:11.533702    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-481000 --alsologtostderr -v=3: (8.252740201s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (8.25s)

TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-481000 -n old-k8s-version-481000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-481000 -n old-k8s-version-481000: exit status 7 (67.40488ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-481000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.32s)
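The `status --format={{.Host}}` invocations above pass a Go text/template that minikube renders against its status value before printing. A minimal sketch of that mechanism, using a hypothetical `Status` struct (not minikube's actual type) whose fields mirror the templates the tests query:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// Status is a hypothetical stand-in for the value minikube renders its
// --format template against; the field names mirror the templates used
// in this log: {{.Host}}, {{.APIServer}}, {{.Kubelet}}.
type Status struct {
	Host      string
	APIServer string
	Kubelet   string
}

// render parses a user-supplied --format string as a text/template and
// executes it against the given Status.
func render(format string, st Status) (string, error) {
	tmpl, err := template.New("status").Parse(format)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, st); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	st := Status{Host: "Stopped", APIServer: "Stopped", Kubelet: "Stopped"}
	out, err := render("{{.Host}}", st)
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // prints "Stopped"
}
```

This is why the test can print `Stopped` as the only stdout line: the template selects a single field of the status value.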

TestStartStop/group/old-k8s-version/serial/SecondStart (470.61s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-481000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0213 15:44:21.774781    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:31.685951    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:44:35.813428    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:44:42.255942    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:44:59.888421    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:45:00.629360    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:45:08.842938    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:45:12.645763    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:45:23.215661    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:45:25.064337    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.069764    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.081191    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.103296    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.144086    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.224218    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.385911    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:25.706652    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:26.348534    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:26.362155    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:45:27.630617    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:30.191736    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:35.312799    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:45.553242    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:45:46.659538    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:45:54.050639    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:45:58.854554    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:46:06.032708    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:46:07.700049    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:07.705320    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:07.716901    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:07.737944    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:07.778886    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:07.859534    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:08.020291    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:08.341110    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:09.003457    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:10.284792    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:12.845523    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:14.349823    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:46:17.966383    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:28.207633    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:46:34.564661    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:46:45.133006    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:46:46.991580    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:46:48.688255    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:47:04.378641    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.384198    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.394318    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.414551    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.455071    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.535982    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:04.696837    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:05.018230    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:05.659973    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:06.941507    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:09.502222    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:14.623506    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:16.769940    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:47:24.863413    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:24.989465    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:47:29.646974    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:47:44.464031    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
E0213 15:47:45.343328    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:47:52.677713    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/custom-flannel-599000/client.crt: no such file or directory
E0213 15:48:08.908907    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:48:26.303118    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:48:46.908223    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
E0213 15:48:50.714705    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:48:51.564576    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-481000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (7m50.450188885s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-481000 -n old-k8s-version-481000
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (470.61s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (6s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-dxsn2" [567b68a5-fdf9-4979-bee5-893caa12eefc] Running
E0213 15:49:01.278943    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:49:03.856441    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/addons-679000/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.003963947s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (6.00s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-dxsn2" [567b68a5-fdf9-4979-bee5-893caa12eefc] Running
E0213 15:49:05.573565    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/ingress-addon-legacy-620000/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.002914089s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-355000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p no-preload-355000 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/no-preload/serial/Pause (1.9s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-355000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-355000 -n no-preload-355000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-355000 -n no-preload-355000: exit status 2 (159.215921ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-355000 -n no-preload-355000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-355000 -n no-preload-355000: exit status 2 (161.091458ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-355000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-355000 -n no-preload-355000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-355000 -n no-preload-355000
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.90s)

TestStartStop/group/embed-certs/serial/FirstStart (168.22s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-402000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4
E0213 15:49:18.399755    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/false-599000/client.crt: no such file or directory
E0213 15:49:28.968975    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/enable-default-cni-599000/client.crt: no such file or directory
E0213 15:49:35.803996    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/skaffold-220000/client.crt: no such file or directory
E0213 15:49:48.220711    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
E0213 15:49:59.879154    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/functional-634000/client.crt: no such file or directory
E0213 15:50:25.052536    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:50:26.350702    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
E0213 15:50:46.649893    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kindnet-599000/client.crt: no such file or directory
E0213 15:50:52.743461    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
E0213 15:51:07.688607    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:51:35.399681    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/bridge-599000/client.crt: no such file or directory
E0213 15:52:04.369405    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-402000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4: (2m48.219086331s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (168.22s)

TestStartStop/group/embed-certs/serial/DeployApp (14.23s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-402000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [99049035-cd95-41de-8a4a-29adc91453b7] Pending
helpers_test.go:344: "busybox" [99049035-cd95-41de-8a4a-29adc91453b7] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [99049035-cd95-41de-8a4a-29adc91453b7] Running
E0213 15:52:16.760456    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/calico-599000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 14.003991177s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-402000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (14.23s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-n5qh5" [0666ccb3-8fa6-4cb1-a213-06f6ae28afb2] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004398235s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-n5qh5" [0666ccb3-8fa6-4cb1-a213-06f6ae28afb2] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.002628215s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-481000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p old-k8s-version-481000 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/old-k8s-version/serial/Pause (1.77s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-481000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-481000 -n old-k8s-version-481000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-481000 -n old-k8s-version-481000: exit status 2 (154.017654ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-481000 -n old-k8s-version-481000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-481000 -n old-k8s-version-481000: exit status 2 (154.073306ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-481000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-481000 -n old-k8s-version-481000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-481000 -n old-k8s-version-481000
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.77s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.81s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-402000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-402000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.81s)

TestStartStop/group/embed-certs/serial/Stop (8.23s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-402000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-402000 --alsologtostderr -v=3: (8.225868807s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (8.23s)

TestStartStop/group/default-k8s-diff-port/serial/FirstStart (50.34s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-603000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-diff-port-603000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4: (50.335933949s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (50.34s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-402000 -n embed-certs-402000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-402000 -n embed-certs-402000: exit status 7 (67.727567ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-402000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-402000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4
E0213 15:52:32.056184    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/kubenet-599000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-402000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4: (5m26.026187273s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-402000 -n embed-certs-402000
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (326.19s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-603000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [63fa4a79-2474-4c6b-b7cc-1be5880158cb] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [63fa4a79-2474-4c6b-b7cc-1be5880158cb] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: integration-test=busybox healthy within 13.002924349s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-603000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (13.23s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-603000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-603000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (0.90s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-diff-port-603000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-diff-port-603000 --alsologtostderr -v=3: (8.300467044s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (8.30s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-603000 -n default-k8s-diff-port-603000: exit status 7 (68.05242ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-diff-port-603000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-c5d8x" [5beda1ae-55dc-4878-8e8c-243db8c785f2] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004537776s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-c5d8x" [5beda1ae-55dc-4878-8e8c-243db8c785f2] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.00282943s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-402000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p embed-certs-402000 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-402000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-402000 -n embed-certs-402000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-402000 -n embed-certs-402000: exit status 2 (162.865691ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-402000 -n embed-certs-402000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-402000 -n embed-certs-402000: exit status 2 (160.421197ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-402000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-402000 -n embed-certs-402000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-402000 -n embed-certs-402000
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.90s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-173000 --alsologtostderr -v=3
E0213 16:15:25.237101    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/flannel-599000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-173000 --alsologtostderr -v=3: (1.244131894s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (1.24s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-173000 -n newest-cni-173000: exit status 7 (67.179904ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-173000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
E0213 16:15:26.533967    3342 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18169-2790/.minikube/profiles/auto-599000/client.crt: no such file or directory
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.32s)


Test skip (22/328)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

=== RUN   TestDownloadOnly/v1.28.4/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.4/cached-images (0.00s)

=== RUN   TestDownloadOnly/v1.28.4/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.4/binaries (0.00s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/cached-images (0.00s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/binaries (0.00s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:498: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

=== RUN   TestImageBuild/serial/validateImageBuildWithBuildEnv
image_test.go:114: skipping due to https://github.com/kubernetes/minikube/issues/12431
--- SKIP: TestImageBuild/serial/validateImageBuildWithBuildEnv (0.00s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:523: 
----------------------- debugLogs start: cilium-599000 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-599000

>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-599000

>>> host: /etc/nsswitch.conf:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: /etc/hosts:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: /etc/resolv.conf:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-599000

>>> host: crictl pods:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: crictl containers:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> k8s: describe netcat deployment:
error: context "cilium-599000" does not exist

>>> k8s: describe netcat pod(s):
error: context "cilium-599000" does not exist

>>> k8s: netcat logs:
error: context "cilium-599000" does not exist

>>> k8s: describe coredns deployment:
error: context "cilium-599000" does not exist

>>> k8s: describe coredns pods:
error: context "cilium-599000" does not exist

>>> k8s: coredns logs:
error: context "cilium-599000" does not exist

>>> k8s: describe api server pod(s):
error: context "cilium-599000" does not exist

>>> k8s: api server logs:
error: context "cilium-599000" does not exist

>>> host: /etc/cni:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: ip a s:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: ip r s:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: iptables-save:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

>>> host: iptables table nat:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-599000

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-599000

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-599000

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-599000

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "cilium-599000" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters: null
contexts: null
current-context: ""
kind: Config
preferences: {}
users: null

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-599000

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: crio daemon config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: /etc/crio:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                

                                                
                                                
>>> host: crio config:
* Profile "cilium-599000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-599000"

                                                
                                                
----------------------- debugLogs end: cilium-599000 [took: 5.567809432s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-599000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cilium-599000
--- SKIP: TestNetworkPlugins/group/cilium (5.94s)
TestStartStop/group/disable-driver-mounts (0.41s)
=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts
=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-376000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-376000
--- SKIP: TestStartStop/group/disable-driver-mounts (0.41s)