Test Report: Hyperkit_macOS 17822

1b14f6e8a127ccddfb64acb15c203e20bb49b800:2023-12-18:32341

Failed tests (11/318)

TestImageBuild/serial/Setup (17.02s)

=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-900000 --driver=hyperkit 
image_test.go:69: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p image-900000 --driver=hyperkit : exit status 90 (16.868397077s)

-- stdout --
	* [image-900000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node image-900000 in cluster image-900000
	* Creating hyperkit VM (CPUs=2, Memory=6000MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	sudo journalctl --no-pager -u cri-docker.socket:
	-- stdout --
	-- Journal begins at Mon 2023-12-18 22:46:52 UTC, ends at Mon 2023-12-18 22:46:58 UTC. --
	Dec 18 22:46:53 minikube systemd[1]: Starting CRI Docker Socket for the API.
	Dec 18 22:46:53 minikube systemd[1]: Listening on CRI Docker Socket for the API.
	Dec 18 22:46:55 image-900000 systemd[1]: cri-docker.socket: Succeeded.
	Dec 18 22:46:55 image-900000 systemd[1]: Closed CRI Docker Socket for the API.
	Dec 18 22:46:55 image-900000 systemd[1]: Stopping CRI Docker Socket for the API.
	Dec 18 22:46:55 image-900000 systemd[1]: Starting CRI Docker Socket for the API.
	Dec 18 22:46:55 image-900000 systemd[1]: Listening on CRI Docker Socket for the API.
	Dec 18 22:46:58 image-900000 systemd[1]: cri-docker.socket: Succeeded.
	Dec 18 22:46:58 image-900000 systemd[1]: Closed CRI Docker Socket for the API.
	Dec 18 22:46:58 image-900000 systemd[1]: Stopping CRI Docker Socket for the API.
	Dec 18 22:46:58 image-900000 systemd[1]: cri-docker.socket: Socket service cri-docker.service already active, refusing.
	Dec 18 22:46:58 image-900000 systemd[1]: Failed to listen on CRI Docker Socket for the API.
	
	-- /stdout --
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
image_test.go:70: failed to start minikube with args: "out/minikube-darwin-amd64 start -p image-900000 --driver=hyperkit " : exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p image-900000 -n image-900000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p image-900000 -n image-900000: exit status 6 (153.300101ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1218 14:46:59.025145    2839 status.go:415] kubeconfig endpoint: extract IP: "image-900000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig

** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "image-900000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestImageBuild/serial/Setup (17.02s)
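The journal excerpt above points at the likely failure mode: `sudo systemctl restart cri-docker.socket` is refused because the paired `cri-docker.service` unit is still active ("Socket service cri-docker.service already active, refusing."). A sketch of a workaround, assuming a standard systemd socket/service pairing inside the guest VM (not a confirmed fix for this run), is to stop the service before restarting its socket:

```shell
# Sketch only: systemd refuses to (re)start a .socket unit while its
# matching .service holds the socket. Stopping the service first avoids
# the "already active, refusing" error seen in the journal above.
sudo systemctl stop cri-docker.service
sudo systemctl restart cri-docker.socket
sudo systemctl start cri-docker.service
```

These commands require a systemd host and elevated privileges; in this report they would be run inside the minikube VM (e.g. via `minikube ssh`).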

TestStartStop/group/no-preload/serial/FirstStart (14.95s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-994000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E1218 15:27:22.801503    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:23.316567    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p no-preload-994000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: exit status 90 (14.788373166s)

-- stdout --
	* [no-preload-994000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node no-preload-994000 in cluster no-preload-994000
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I1218 15:27:20.723427    7401 out.go:296] Setting OutFile to fd 1 ...
	I1218 15:27:20.723951    7401 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:27:20.723958    7401 out.go:309] Setting ErrFile to fd 2...
	I1218 15:27:20.723963    7401 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:27:20.724263    7401 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 15:27:20.726478    7401 out.go:303] Setting JSON to false
	I1218 15:27:20.749710    7401 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":3411,"bootTime":1702938629,"procs":507,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 15:27:20.749812    7401 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 15:27:20.772462    7401 out.go:177] * [no-preload-994000] minikube v1.32.0 on Darwin 14.2
	I1218 15:27:20.817327    7401 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 15:27:20.817378    7401 notify.go:220] Checking for updates...
	I1218 15:27:20.841655    7401 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:27:20.863420    7401 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 15:27:20.884224    7401 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 15:27:20.906328    7401 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:27:20.927384    7401 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 15:27:20.949240    7401 config.go:182] Loaded profile config "old-k8s-version-920000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	I1218 15:27:20.949394    7401 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 15:27:20.979232    7401 out.go:177] * Using the hyperkit driver based on user configuration
	I1218 15:27:21.021149    7401 start.go:298] selected driver: hyperkit
	I1218 15:27:21.021182    7401 start.go:902] validating driver "hyperkit" against <nil>
	I1218 15:27:21.021212    7401 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 15:27:21.025340    7401 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.025446    7401 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 15:27:21.033229    7401 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 15:27:21.037021    7401 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:27:21.037041    7401 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 15:27:21.037074    7401 start_flags.go:309] no existing cluster config was found, will generate one from the flags 
	I1218 15:27:21.037272    7401 start_flags.go:931] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 15:27:21.037338    7401 cni.go:84] Creating CNI manager for ""
	I1218 15:27:21.037354    7401 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:27:21.037363    7401 start_flags.go:318] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I1218 15:27:21.037370    7401 start_flags.go:323] config:
	{Name:no-preload-994000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:no-preload-994000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:27:21.037505    7401 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.079162    7401 out.go:177] * Starting control plane node no-preload-994000 in cluster no-preload-994000
	I1218 15:27:21.100097    7401 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I1218 15:27:21.100275    7401 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/config.json ...
	I1218 15:27:21.100330    7401 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/config.json: {Name:mk8e056f1bfec088bc755d1ed40fa36a609c1d49 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:27:21.100348    7401 cache.go:107] acquiring lock: {Name:mk42dacd7e8ee680e7136967b23a0057ee9799f1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100420    7401 cache.go:107] acquiring lock: {Name:mk7bcde6ebac668831bc4338f3e4e56dfbd34ef4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100453    7401 cache.go:107] acquiring lock: {Name:mka87075a91d7ec81a224018fb66f82883aee775 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100502    7401 cache.go:115] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1218 15:27:21.100492    7401 cache.go:107] acquiring lock: {Name:mk12c52a18a5b3837c561e2a99cb6c11383ac3c7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100537    7401 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5" took 198.387µs
	I1218 15:27:21.100388    7401 cache.go:107] acquiring lock: {Name:mkebb22defd5c2c4bb98251aa5aeb38a633885aa Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100555    7401 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1218 15:27:21.100583    7401 cache.go:107] acquiring lock: {Name:mk9809f5d047b800623ec947f89b24512ab9b671 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100642    7401 cache.go:107] acquiring lock: {Name:mk620e20c3c87f1cefa52d89f4652311fd10ae26 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100633    7401 cache.go:107] acquiring lock: {Name:mk366a6ab19e9442c90c91545341f8eb5131245f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:27:21.100811    7401 image.go:134] retrieving image: registry.k8s.io/kube-scheduler:v1.29.0-rc.2
	I1218 15:27:21.100838    7401 image.go:134] retrieving image: registry.k8s.io/kube-controller-manager:v1.29.0-rc.2
	I1218 15:27:21.101069    7401 image.go:134] retrieving image: registry.k8s.io/kube-apiserver:v1.29.0-rc.2
	I1218 15:27:21.101105    7401 image.go:134] retrieving image: registry.k8s.io/kube-proxy:v1.29.0-rc.2
	I1218 15:27:21.101223    7401 image.go:134] retrieving image: registry.k8s.io/pause:3.9
	I1218 15:27:21.101294    7401 image.go:134] retrieving image: registry.k8s.io/etcd:3.5.10-0
	I1218 15:27:21.101292    7401 start.go:365] acquiring machines lock for no-preload-994000: {Name:mk129da0b7e14236047c6f70b7fc622a9cc1d994 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1218 15:27:21.101336    7401 image.go:134] retrieving image: registry.k8s.io/coredns/coredns:v1.11.1
	I1218 15:27:21.101409    7401 start.go:369] acquired machines lock for "no-preload-994000" in 79.838µs
	I1218 15:27:21.101455    7401 start.go:93] Provisioning new machine with config: &{Name:no-preload-994000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:no-preload-994000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:} &{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1218 15:27:21.101582    7401 start.go:125] createHost starting for "" (driver="hyperkit")
	I1218 15:27:21.123413    7401 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I1218 15:27:21.123807    7401 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:27:21.123871    7401 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:27:21.126745    7401 image.go:177] daemon lookup for registry.k8s.io/etcd:3.5.10-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.5.10-0
	I1218 15:27:21.127695    7401 image.go:177] daemon lookup for registry.k8s.io/kube-proxy:v1.29.0-rc.2: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.29.0-rc.2
	I1218 15:27:21.127718    7401 image.go:177] daemon lookup for registry.k8s.io/kube-apiserver:v1.29.0-rc.2: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.29.0-rc.2
	I1218 15:27:21.128280    7401 image.go:177] daemon lookup for registry.k8s.io/coredns/coredns:v1.11.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.11.1
	I1218 15:27:21.128317    7401 image.go:177] daemon lookup for registry.k8s.io/kube-controller-manager:v1.29.0-rc.2: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.29.0-rc.2
	I1218 15:27:21.128311    7401 image.go:177] daemon lookup for registry.k8s.io/pause:3.9: Error response from daemon: No such image: registry.k8s.io/pause:3.9
	I1218 15:27:21.128365    7401 image.go:177] daemon lookup for registry.k8s.io/kube-scheduler:v1.29.0-rc.2: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.29.0-rc.2
	I1218 15:27:21.134180    7401 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56200
	I1218 15:27:21.134531    7401 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:27:21.134973    7401 main.go:141] libmachine: Using API Version  1
	I1218 15:27:21.134985    7401 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:27:21.135204    7401 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:27:21.135312    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetMachineName
	I1218 15:27:21.135396    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:21.135515    7401 start.go:159] libmachine.API.Create for "no-preload-994000" (driver="hyperkit")
	I1218 15:27:21.135546    7401 client.go:168] LocalClient.Create starting
	I1218 15:27:21.135578    7401 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem
	I1218 15:27:21.135627    7401 main.go:141] libmachine: Decoding PEM data...
	I1218 15:27:21.135644    7401 main.go:141] libmachine: Parsing certificate...
	I1218 15:27:21.135702    7401 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem
	I1218 15:27:21.135737    7401 main.go:141] libmachine: Decoding PEM data...
	I1218 15:27:21.135751    7401 main.go:141] libmachine: Parsing certificate...
	I1218 15:27:21.135764    7401 main.go:141] libmachine: Running pre-create checks...
	I1218 15:27:21.135773    7401 main.go:141] libmachine: (no-preload-994000) Calling .PreCreateCheck
	I1218 15:27:21.135860    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:21.136040    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetConfigRaw
	I1218 15:27:21.136537    7401 main.go:141] libmachine: Creating machine...
	I1218 15:27:21.136546    7401 main.go:141] libmachine: (no-preload-994000) Calling .Create
	I1218 15:27:21.136624    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:21.136790    7401 main.go:141] libmachine: (no-preload-994000) DBG | I1218 15:27:21.136616    7409 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:27:21.136850    7401 main.go:141] libmachine: (no-preload-994000) Downloading /Users/jenkins/minikube-integration/17822-999/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17822-999/.minikube/cache/iso/amd64/minikube-v1.32.1-1702708929-17806-amd64.iso...
	I1218 15:27:21.373931    7401 main.go:141] libmachine: (no-preload-994000) DBG | I1218 15:27:21.373714    7409 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/id_rsa...
	I1218 15:27:21.464749    7401 main.go:141] libmachine: (no-preload-994000) DBG | I1218 15:27:21.464530    7409 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/no-preload-994000.rawdisk...
	I1218 15:27:21.464823    7401 main.go:141] libmachine: (no-preload-994000) DBG | Writing magic tar header
	I1218 15:27:21.464849    7401 main.go:141] libmachine: (no-preload-994000) DBG | Writing SSH key tar header
	I1218 15:27:21.465205    7401 main.go:141] libmachine: (no-preload-994000) DBG | I1218 15:27:21.465122    7409 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000 ...
	I1218 15:27:21.604786    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.10-0
	I1218 15:27:21.607059    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.29.0-rc.2
	I1218 15:27:21.625569    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.29.0-rc.2
	I1218 15:27:21.631321    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.11.1
	I1218 15:27:21.653139    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.29.0-rc.2
	I1218 15:27:21.700510    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9
	I1218 15:27:21.753984    7401 cache.go:162] opening:  /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.29.0-rc.2
	I1218 15:27:21.798387    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:21.798406    7401 main.go:141] libmachine: (no-preload-994000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/hyperkit.pid
	I1218 15:27:21.798437    7401 main.go:141] libmachine: (no-preload-994000) DBG | Using UUID fdd8e824-9dfc-11ee-8fe5-f01898ef957c
	I1218 15:27:21.819203    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9 exists
	I1218 15:27:21.819225    7401 cache.go:96] cache image "registry.k8s.io/pause:3.9" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9" took 718.646893ms
	I1218 15:27:21.819243    7401 cache.go:80] save to tar file registry.k8s.io/pause:3.9 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9 succeeded
	I1218 15:27:21.834755    7401 main.go:141] libmachine: (no-preload-994000) DBG | Generated MAC e2:61:5d:1b:8b:e
	I1218 15:27:21.834777    7401 main.go:141] libmachine: (no-preload-994000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-994000
	I1218 15:27:21.834806    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"fdd8e824-9dfc-11ee-8fe5-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002065d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:27:21.834844    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"fdd8e824-9dfc-11ee-8fe5-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002065d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:27:21.834887    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "fdd8e824-9dfc-11ee-8fe5-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/no-preload-994000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-994000"}
	I1218 15:27:21.834936    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U fdd8e824-9dfc-11ee-8fe5-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/no-preload-994000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/console-ring -f kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-994000"
	I1218 15:27:21.834949    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1218 15:27:21.837935    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 DEBUG: hyperkit: Pid is 7438
	I1218 15:27:21.838394    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 0
	I1218 15:27:21.838419    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:21.838488    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:21.839578    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:21.839702    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 40 entries in /var/db/dhcpd_leases!
	I1218 15:27:21.839715    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x65822679}
	I1218 15:27:21.839742    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:27:21.839756    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:27:21.839772    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:27:21.839786    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:27:21.839804    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:27:21.839823    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:27:21.839833    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:27:21.839842    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:27:21.839850    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:27:21.839859    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:27:21.839870    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:27:21.839881    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:27:21.839889    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:27:21.839898    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:27:21.839905    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:27:21.839916    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:27:21.839932    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:27:21.839945    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:27:21.839962    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:27:21.839986    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:27:21.840003    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:27:21.840019    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:27:21.840033    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:27:21.840046    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:27:21.840064    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:27:21.840078    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:27:21.840091    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:27:21.840113    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:27:21.840125    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:27:21.840138    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:27:21.840152    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:27:21.840163    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:27:21.840176    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:27:21.840206    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:27:21.840221    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:27:21.840235    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:27:21.840244    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:27:21.840251    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:27:21.840263    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
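(Editor's note: the "Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases" scan above repeats each attempt until the new VM's MAC appears with a lease. One detail worth noting when reading these entries: macOS writes MAC octets without leading zeros, which is why addresses like "e:b8:aa:f6:3d:6" appear in the dump. The following is a hypothetical, minimal sketch of that kind of lease lookup, not minikube's actual driver code; the function names and parsing assumptions are the editor's own.)

```python
import re
from typing import Optional

def normalize_mac(mac: str) -> str:
    """Strip leading zeros from each octet, as macOS dhcpd_leases does
    (e.g. "0e:b8:aa:f6:3d:06" -> "e:b8:aa:f6:3d:6")."""
    return ":".join(format(int(octet, 16), "x") for octet in mac.split(":"))

def find_ip_for_mac(leases_text: str, mac: str) -> Optional[str]:
    """Scan dhcpd_leases-style text for the IP bound to the given MAC.

    Assumes each lease entry lists ip_address= before hw_address=,
    as /var/db/dhcpd_leases does on macOS.
    """
    target = normalize_mac(mac)
    ip = None
    for line in leases_text.splitlines():
        m = re.match(r"\s*ip_address=(\S+)", line)
        if m:
            ip = m.group(1)  # remember the current entry's IP
            continue
        m = re.match(r"\s*hw_address=1,(\S+)", line)
        if m and normalize_mac(m.group(1)) == target:
            return ip  # MAC matched within this entry
    return None  # not leased yet; caller retries, as the log shows
```

The driver retries this scan on an interval (Attempt 0, 1, 2, ... in the log) until the MAC shows up, which is why the same 40 entries are dumped repeatedly before the VM obtains its lease.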
	I1218 15:27:21.845566    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1218 15:27:21.854174    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1218 15:27:21.855102    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:27:21.855118    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:27:21.855139    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:27:21.855161    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:27:22.224797    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1218 15:27:22.224813    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1218 15:27:22.328805    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:27:22.328820    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:27:22.328831    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:27:22.328847    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:27:22.329758    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1218 15:27:22.329771    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:22 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1218 15:27:23.840825    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 1
	I1218 15:27:23.840842    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:23.840900    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:23.841738    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:23.841817    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 40 entries in /var/db/dhcpd_leases!
	I1218 15:27:23.841838    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x65822679}
	I1218 15:27:23.841861    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:27:23.841883    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:27:23.841916    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:27:23.841939    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:27:23.841954    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:27:23.841964    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:27:23.841984    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:27:23.841994    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:27:23.842008    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:27:23.842022    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:27:23.842033    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:27:23.842044    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:27:23.842053    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:27:23.842063    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:27:23.842071    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:27:23.842080    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:27:23.842093    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:27:23.842119    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:27:23.842132    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:27:23.842141    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:27:23.842150    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:27:23.842175    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:27:23.842189    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:27:23.842197    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:27:23.842207    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:27:23.842215    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:27:23.842223    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:27:23.842232    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:27:23.842242    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:27:23.842253    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:27:23.842274    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:27:23.842286    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:27:23.842294    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:27:23.842309    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:27:23.842317    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:27:23.842324    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:27:23.842333    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:27:23.842340    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:27:23.842350    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:27:24.117130    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.29.0-rc.2 exists
	I1218 15:27:24.117148    7401 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.29.0-rc.2" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.29.0-rc.2" took 3.016649079s
	I1218 15:27:24.117159    7401 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.29.0-rc.2 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.29.0-rc.2 succeeded
	I1218 15:27:24.677134    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.11.1 exists
	I1218 15:27:24.677150    7401 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.11.1" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.11.1" took 3.576523384s
	I1218 15:27:24.677158    7401 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.11.1 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.11.1 succeeded
	I1218 15:27:25.647809    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.29.0-rc.2 exists
	I1218 15:27:25.647826    7401 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.29.0-rc.2" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.29.0-rc.2" took 4.547387107s
	I1218 15:27:25.647834    7401 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.29.0-rc.2 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.29.0-rc.2 succeeded
	I1218 15:27:25.843269    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 2
	I1218 15:27:25.843290    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:25.843395    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:25.844180    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:25.844262    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 40 entries in /var/db/dhcpd_leases!
	I1218 15:27:25.844272    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x65822679}
	I1218 15:27:25.844281    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:27:25.844288    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:27:25.844305    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:27:25.844317    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:27:25.844326    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:27:25.844332    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:27:25.844340    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:27:25.844346    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:27:25.844367    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:27:25.844383    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:27:25.844394    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:27:25.844402    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:27:25.844409    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:27:25.844418    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:27:25.844426    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:27:25.844438    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:27:25.844453    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:27:25.844462    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:27:25.844470    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:27:25.844479    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:27:25.844487    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:27:25.844495    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:27:25.844503    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:27:25.844512    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:27:25.844524    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:27:25.844536    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:27:25.844544    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:27:25.844555    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:27:25.844563    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:27:25.844573    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:27:25.844583    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:27:25.844593    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:27:25.844603    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:27:25.844611    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:27:25.844619    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:27:25.844628    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:27:25.844635    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:27:25.844644    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:27:25.844660    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:27:27.293081    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:27 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1218 15:27:27.293193    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:27 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1218 15:27:27.293204    7401 main.go:141] libmachine: (no-preload-994000) DBG | 2023/12/18 15:27:27 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1218 15:27:27.391090    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.29.0-rc.2 exists
	I1218 15:27:27.391116    7401 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.29.0-rc.2" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.29.0-rc.2" took 6.290560235s
	I1218 15:27:27.391125    7401 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.29.0-rc.2 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.29.0-rc.2 succeeded
	I1218 15:27:27.501928    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.29.0-rc.2 exists
	I1218 15:27:27.501948    7401 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.29.0-rc.2" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.29.0-rc.2" took 6.401457613s
	I1218 15:27:27.501957    7401 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.29.0-rc.2 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.29.0-rc.2 succeeded
	I1218 15:27:27.844858    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 3
	I1218 15:27:27.844876    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:27.844992    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:27.845808    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:27.846008    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 40 entries in /var/db/dhcpd_leases!
	I1218 15:27:27.846027    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x65822679}
	I1218 15:27:27.846036    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:27:27.846044    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:27:27.846052    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:27:27.846059    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:27:27.846068    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:27:27.846086    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:27:27.846099    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:27:27.846119    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:27:27.846127    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:27:27.846135    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:27:27.846143    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:27:27.846152    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:27:27.846158    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:27:27.846167    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:27:27.846174    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:27:27.846181    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:27:27.846188    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:27:27.846197    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:27:27.846204    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:27:27.846213    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:27:27.846221    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:27:27.846229    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:27:27.846236    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:27:27.846243    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:27:27.846273    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:27:27.846290    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:27:27.846304    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:27:27.846313    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:27:27.846322    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:27:27.846331    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:27:27.846340    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:27:27.846348    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:27:27.846356    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:27:27.846363    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:27:27.846375    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:27:27.846384    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:27:27.846392    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:27:27.846402    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:27:27.846410    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:27:29.410120    7401 cache.go:157] /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.10-0 exists
	I1218 15:27:29.410137    7401 cache.go:96] cache image "registry.k8s.io/etcd:3.5.10-0" -> "/Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.10-0" took 8.309497572s
	I1218 15:27:29.410145    7401 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.10-0 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.10-0 succeeded
	I1218 15:27:29.410163    7401 cache.go:87] Successfully saved all images to host disk.
	I1218 15:27:29.847478    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 4
	I1218 15:27:29.847495    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:29.847599    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:29.848442    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:29.848515    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 40 entries in /var/db/dhcpd_leases!
	I1218 15:27:31.849270    7401 main.go:141] libmachine: (no-preload-994000) DBG | Attempt 5
	I1218 15:27:31.849313    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:31.849525    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:31.851048    7401 main.go:141] libmachine: (no-preload-994000) DBG | Searching for e2:61:5d:1b:8b:e in /var/db/dhcpd_leases ...
	I1218 15:27:31.851176    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found 41 entries in /var/db/dhcpd_leases!
	I1218 15:27:31.851200    7401 main.go:141] libmachine: (no-preload-994000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x658226e2}
	I1218 15:27:31.851218    7401 main.go:141] libmachine: (no-preload-994000) DBG | Found match: e2:61:5d:1b:8b:e
	I1218 15:27:31.851252    7401 main.go:141] libmachine: (no-preload-994000) DBG | IP: 192.169.0.42
	I1218 15:27:31.851341    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetConfigRaw
	I1218 15:27:31.852229    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:31.852399    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:31.852555    7401 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I1218 15:27:31.852574    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetState
	I1218 15:27:31.852705    7401 main.go:141] libmachine: (no-preload-994000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:27:31.852787    7401 main.go:141] libmachine: (no-preload-994000) DBG | hyperkit pid from json: 7438
	I1218 15:27:31.853739    7401 main.go:141] libmachine: Detecting operating system of created instance...
	I1218 15:27:31.853750    7401 main.go:141] libmachine: Waiting for SSH to be available...
	I1218 15:27:31.853755    7401 main.go:141] libmachine: Getting to WaitForSSH function...
	I1218 15:27:31.853763    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:31.853857    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:31.853963    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.854051    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.854134    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:31.854239    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:31.854543    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:31.854551    7401 main.go:141] libmachine: About to run SSH command:
	exit 0
	I1218 15:27:31.914766    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I1218 15:27:31.914778    7401 main.go:141] libmachine: Detecting the provisioner...
	I1218 15:27:31.914785    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:31.914909    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:31.915027    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.915114    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.915210    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:31.915339    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:31.915591    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:31.915604    7401 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I1218 15:27:31.976861    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-gae27a7b-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I1218 15:27:31.976920    7401 main.go:141] libmachine: found compatible host: buildroot
	I1218 15:27:31.976926    7401 main.go:141] libmachine: Provisioning with buildroot...
	I1218 15:27:31.976933    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetMachineName
	I1218 15:27:31.977071    7401 buildroot.go:166] provisioning hostname "no-preload-994000"
	I1218 15:27:31.977089    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetMachineName
	I1218 15:27:31.977186    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:31.977283    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:31.977372    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.977454    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:31.977534    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:31.977668    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:31.977920    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:31.977929    7401 main.go:141] libmachine: About to run SSH command:
	sudo hostname no-preload-994000 && echo "no-preload-994000" | sudo tee /etc/hostname
	I1218 15:27:32.049006    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: no-preload-994000
	
	I1218 15:27:32.049026    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.049161    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.049262    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.049356    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.049457    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.049587    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:32.049839    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:32.049851    7401 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sno-preload-994000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 no-preload-994000/g' /etc/hosts;
				else 
					echo '127.0.1.1 no-preload-994000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 15:27:32.117251    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I1218 15:27:32.117270    7401 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/17822-999/.minikube CaCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/17822-999/.minikube}
	I1218 15:27:32.117282    7401 buildroot.go:174] setting up certificates
	I1218 15:27:32.117294    7401 provision.go:83] configureAuth start
	I1218 15:27:32.117302    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetMachineName
	I1218 15:27:32.117426    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetIP
	I1218 15:27:32.117547    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.117642    7401 provision.go:138] copyHostCerts
	I1218 15:27:32.117731    7401 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem, removing ...
	I1218 15:27:32.117741    7401 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem
	I1218 15:27:32.117872    7401 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem (1123 bytes)
	I1218 15:27:32.118111    7401 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem, removing ...
	I1218 15:27:32.118118    7401 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem
	I1218 15:27:32.118202    7401 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem (1675 bytes)
	I1218 15:27:32.118378    7401 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem, removing ...
	I1218 15:27:32.118383    7401 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem
	I1218 15:27:32.118457    7401 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem (1082 bytes)
	I1218 15:27:32.118605    7401 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca-key.pem org=jenkins.no-preload-994000 san=[192.169.0.42 192.169.0.42 localhost 127.0.0.1 minikube no-preload-994000]
	I1218 15:27:32.155500    7401 provision.go:172] copyRemoteCerts
	I1218 15:27:32.155552    7401 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 15:27:32.155570    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.155729    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.155847    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.155941    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.156041    7401 sshutil.go:53] new ssh client: &{IP:192.169.0.42 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/id_rsa Username:docker}
	I1218 15:27:32.191620    7401 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1218 15:27:32.207862    7401 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem --> /etc/docker/server.pem (1229 bytes)
	I1218 15:27:32.223917    7401 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 15:27:32.239707    7401 provision.go:86] duration metric: configureAuth took 122.398143ms
	I1218 15:27:32.239719    7401 buildroot.go:189] setting minikube options for container-runtime
	I1218 15:27:32.239854    7401 config.go:182] Loaded profile config "no-preload-994000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.29.0-rc.2
	I1218 15:27:32.239867    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.240002    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.240097    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.240178    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.240262    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.240348    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.240476    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:32.240715    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:32.240723    7401 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I1218 15:27:32.302818    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I1218 15:27:32.302829    7401 buildroot.go:70] root file system type: tmpfs
	I1218 15:27:32.302899    7401 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I1218 15:27:32.302912    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.303033    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.303114    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.303194    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.303267    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.303392    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:32.303636    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:32.303682    7401 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I1218 15:27:32.374180    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I1218 15:27:32.374201    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.374352    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.374453    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.374546    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.374632    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.374767    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:32.375009    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:32.375021    7401 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I1218 15:27:32.837509    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I1218 15:27:32.837523    7401 main.go:141] libmachine: Checking connection to Docker...
	I1218 15:27:32.837531    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetURL
	I1218 15:27:32.837666    7401 main.go:141] libmachine: Docker is up and running!
	I1218 15:27:32.837672    7401 main.go:141] libmachine: Reticulating splines...
	I1218 15:27:32.837677    7401 client.go:171] LocalClient.Create took 11.701939706s
	I1218 15:27:32.837695    7401 start.go:167] duration metric: libmachine.API.Create for "no-preload-994000" took 11.701994672s
	I1218 15:27:32.837705    7401 start.go:300] post-start starting for "no-preload-994000" (driver="hyperkit")
	I1218 15:27:32.837716    7401 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 15:27:32.837728    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.837871    7401 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 15:27:32.837885    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.837974    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.838053    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.838139    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.838221    7401 sshutil.go:53] new ssh client: &{IP:192.169.0.42 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/id_rsa Username:docker}
	I1218 15:27:32.875391    7401 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 15:27:32.878048    7401 info.go:137] Remote host: Buildroot 2021.02.12
	I1218 15:27:32.878061    7401 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17822-999/.minikube/addons for local assets ...
	I1218 15:27:32.878149    7401 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17822-999/.minikube/files for local assets ...
	I1218 15:27:32.878330    7401 filesync.go:149] local asset: /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I1218 15:27:32.878538    7401 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 15:27:32.884812    7401 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I1218 15:27:32.900216    7401 start.go:303] post-start completed in 62.501232ms
	I1218 15:27:32.900238    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetConfigRaw
	I1218 15:27:32.900805    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetIP
	I1218 15:27:32.900953    7401 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/config.json ...
	I1218 15:27:32.901287    7401 start.go:128] duration metric: createHost completed in 11.799506964s
	I1218 15:27:32.901306    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.901412    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.901496    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.901576    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.901654    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.901780    7401 main.go:141] libmachine: Using SSH client type: native
	I1218 15:27:32.902022    7401 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.42 22 <nil> <nil>}
	I1218 15:27:32.902030    7401 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I1218 15:27:32.963690    7401 main.go:141] libmachine: SSH cmd err, output: <nil>: 1702942052.779411130
	
	I1218 15:27:32.963704    7401 fix.go:206] guest clock: 1702942052.779411130
	I1218 15:27:32.963711    7401 fix.go:219] Guest: 2023-12-18 15:27:32.77941113 -0800 PST Remote: 2023-12-18 15:27:32.901299 -0800 PST m=+12.253641755 (delta=-121.88787ms)
	I1218 15:27:32.963730    7401 fix.go:190] guest clock delta is within tolerance: -121.88787ms
	I1218 15:27:32.963739    7401 start.go:83] releasing machines lock for "no-preload-994000", held for 11.862124435s
	I1218 15:27:32.963760    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.963896    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetIP
	I1218 15:27:32.963992    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.964290    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.964387    7401 main.go:141] libmachine: (no-preload-994000) Calling .DriverName
	I1218 15:27:32.964455    7401 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 15:27:32.964480    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.964537    7401 ssh_runner.go:195] Run: cat /version.json
	I1218 15:27:32.964555    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHHostname
	I1218 15:27:32.964595    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.964674    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHPort
	I1218 15:27:32.964682    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.964782    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHKeyPath
	I1218 15:27:32.964795    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.964875    7401 main.go:141] libmachine: (no-preload-994000) Calling .GetSSHUsername
	I1218 15:27:32.964872    7401 sshutil.go:53] new ssh client: &{IP:192.169.0.42 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/id_rsa Username:docker}
	I1218 15:27:32.964964    7401 sshutil.go:53] new ssh client: &{IP:192.169.0.42 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/no-preload-994000/id_rsa Username:docker}
	I1218 15:27:32.997249    7401 ssh_runner.go:195] Run: systemctl --version
	I1218 15:27:33.001131    7401 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 15:27:33.052746    7401 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 15:27:33.052829    7401 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 15:27:33.063564    7401 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I1218 15:27:33.063579    7401 start.go:475] detecting cgroup driver to use...
	I1218 15:27:33.063682    7401 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 15:27:33.076930    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I1218 15:27:33.083457    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1218 15:27:33.089783    7401 containerd.go:145] configuring containerd to use "cgroupfs" as cgroup driver...
	I1218 15:27:33.089829    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1218 15:27:33.096210    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1218 15:27:33.102675    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1218 15:27:33.109146    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1218 15:27:33.115599    7401 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 15:27:33.122211    7401 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1218 15:27:33.128714    7401 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 15:27:33.134547    7401 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 15:27:33.140534    7401 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:27:33.222210    7401 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1218 15:27:33.234431    7401 start.go:475] detecting cgroup driver to use...
	I1218 15:27:33.234500    7401 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I1218 15:27:33.246702    7401 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 15:27:33.260800    7401 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 15:27:33.275349    7401 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 15:27:33.283673    7401 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1218 15:27:33.292462    7401 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1218 15:27:33.312845    7401 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1218 15:27:33.321122    7401 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 15:27:33.333939    7401 ssh_runner.go:195] Run: which cri-dockerd
	I1218 15:27:33.336344    7401 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I1218 15:27:33.341888    7401 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I1218 15:27:33.353231    7401 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I1218 15:27:33.445349    7401 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I1218 15:27:33.533212    7401 docker.go:560] configuring docker to use "cgroupfs" as cgroup driver...
	I1218 15:27:33.533289    7401 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I1218 15:27:33.544927    7401 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:27:33.628424    7401 ssh_runner.go:195] Run: sudo systemctl restart docker
	I1218 15:27:34.852996    7401 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.224533927s)
	I1218 15:27:34.853054    7401 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I1218 15:27:34.937142    7401 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I1218 15:27:35.021977    7401 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I1218 15:27:35.119991    7401 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:27:35.219762    7401 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I1218 15:27:35.230950    7401 ssh_runner.go:195] Run: sudo journalctl --no-pager -u cri-docker.socket
	I1218 15:27:35.261671    7401 out.go:177] 
	W1218 15:27:35.283617    7401 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	sudo journalctl --no-pager -u cri-docker.socket:
	-- stdout --
	-- Journal begins at Mon 2023-12-18 23:27:29 UTC, ends at Mon 2023-12-18 23:27:35 UTC. --
	Dec 18 23:27:30 minikube systemd[1]: Starting CRI Docker Socket for the API.
	Dec 18 23:27:30 minikube systemd[1]: Listening on CRI Docker Socket for the API.
	Dec 18 23:27:32 no-preload-994000 systemd[1]: cri-docker.socket: Succeeded.
	Dec 18 23:27:32 no-preload-994000 systemd[1]: Closed CRI Docker Socket for the API.
	Dec 18 23:27:32 no-preload-994000 systemd[1]: Stopping CRI Docker Socket for the API.
	Dec 18 23:27:32 no-preload-994000 systemd[1]: Starting CRI Docker Socket for the API.
	Dec 18 23:27:32 no-preload-994000 systemd[1]: Listening on CRI Docker Socket for the API.
	Dec 18 23:27:35 no-preload-994000 systemd[1]: cri-docker.socket: Succeeded.
	Dec 18 23:27:35 no-preload-994000 systemd[1]: Closed CRI Docker Socket for the API.
	Dec 18 23:27:35 no-preload-994000 systemd[1]: Stopping CRI Docker Socket for the API.
	Dec 18 23:27:35 no-preload-994000 systemd[1]: cri-docker.socket: Socket service cri-docker.service already active, refusing.
	Dec 18 23:27:35 no-preload-994000 systemd[1]: Failed to listen on CRI Docker Socket for the API.
	
	-- /stdout --
	W1218 15:27:35.283657    7401 out.go:239] * 
	W1218 15:27:35.284915    7401 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 15:27:35.349529    7401 out.go:177] 

                                                
                                                
** /stderr **
start_stop_delete_test.go:188: failed starting minikube -first start-. args "out/minikube-darwin-amd64 start -p no-preload-994000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2": exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000: exit status 6 (145.365938ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 15:27:35.579549    7444 status.go:415] kubeconfig endpoint: extract IP: "no-preload-994000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-994000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/FirstStart (14.95s)
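The journal output above shows the failure mode: minikube runs `systemctl restart cri-docker.socket` while `cri-docker.service` is already active, and systemd refuses to re-listen on a socket unit whose paired service is running ("Socket service cri-docker.service already active, refusing."). A hedged sketch of a manual recovery sequence, assuming SSH access to the guest and the unit names shown in the journal (this is not part of minikube itself):

```shell
# Hypothetical manual recovery for the "Socket service cri-docker.service
# already active, refusing." error in the journal above. Defines the steps
# as a function; run it on the guest VM, not on the build host.
set -eu

recover_cri_docker_socket() {
  # Stop the paired service first: systemd will not (re)start a socket
  # unit while the service it activates is still running.
  sudo systemctl stop cri-docker.service

  # With the service down, the socket unit can be restarted cleanly.
  sudo systemctl restart cri-docker.socket

  # Bring the service back up (socket activation would also start it
  # on the first connection).
  sudo systemctl start cri-docker.service

  # Confirm both units are active.
  systemctl is-active cri-docker.socket cri-docker.service
}

# recover_cri_docker_socket   # invoke on the VM
```

The same ordering (stop service, then restart socket, then start service) is what the failing code path skips by restarting the socket directly.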

                                                
                                    
TestStartStop/group/no-preload/serial/DeployApp (0.33s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-994000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) Non-zero exit: kubectl --context no-preload-994000 create -f testdata/busybox.yaml: exit status 1 (35.493258ms)

                                                
                                                
** stderr ** 
	error: no openapi getter

                                                
                                                
** /stderr **
start_stop_delete_test.go:196: kubectl --context no-preload-994000 create -f testdata/busybox.yaml failed: exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000: exit status 6 (145.103427ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 15:27:35.761135    7450 status.go:415] kubeconfig endpoint: extract IP: "no-preload-994000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-994000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000: exit status 6 (144.191876ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 15:27:35.905856    7455 status.go:415] kubeconfig endpoint: extract IP: "no-preload-994000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-994000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/DeployApp (0.33s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonWhileActive (117.6s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-994000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
E1218 15:27:43.282467    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:43.798268    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:27:55.210297    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:27:57.868435    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
start_stop_delete_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-994000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: exit status 10 (1m57.419411203s)

                                                
                                                
-- stdout --
	* metrics-server is an addon maintained by Kubernetes. For any concerns contact minikube on GitHub.
	You can view the list of minikube maintainers at: https://github.com/kubernetes/minikube/blob/master/OWNERS
	  - Using image fake.domain/registry.k8s.io/echoserver:1.4
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_ADDON_ENABLE: enable failed: run callbacks: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.29.0-rc.2/kubectl apply --force -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: Process exited with status 1
	stdout:
	
	stderr:
	sudo: /var/lib/minikube/binaries/v1.29.0-rc.2/kubectl: command not found
	]
	* 
	╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                           │
	│    * If the above advice does not help, please let us know:                                                               │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                             │
	│                                                                                                                           │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                  │
	│    * Please also attach the following file to the GitHub issue:                                                           │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_addons_2bafae6fa40fec163538f94366e390b0317a8b15_0.log    │
	│                                                                                                                           │
	╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
start_stop_delete_test.go:207: failed to enable an addon post-stop. args "out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-994000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain": exit status 10
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-994000 describe deploy/metrics-server -n kube-system
start_stop_delete_test.go:215: (dbg) Non-zero exit: kubectl --context no-preload-994000 describe deploy/metrics-server -n kube-system: exit status 1 (35.875417ms)

                                                
                                                
** stderr ** 
	error: context "no-preload-994000" does not exist

                                                
                                                
** /stderr **
start_stop_delete_test.go:217: failed to get info on auto-pause deployments. args "kubectl --context no-preload-994000 describe deploy/metrics-server -n kube-system": exit status 1
start_stop_delete_test.go:221: addon did not load correct image. Expected to contain " fake.domain/registry.k8s.io/echoserver:1.4". Addon deployment info: 
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000: exit status 6 (143.069358ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1218 15:29:33.506773    7566 status.go:415] kubeconfig endpoint: extract IP: "no-preload-994000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig

** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-994000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (117.60s)

TestStartStop/group/embed-certs/serial/SecondStart (484.34s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-732000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4
E1218 15:32:57.872292    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:33:13.649157    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:33:22.750910    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:33:36.105486    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:33:39.130492    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 15:33:50.440968    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:34:03.831141    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:34:35.571709    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:34:40.418310    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:34:51.455199    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:34:55.594910    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:34:57.367200    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:35:11.365208    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:35:19.142856    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:35:23.287286    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:35:27.000870    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:35:28.799403    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:36:22.282054    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.288399    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.299907    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.321638    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.361898    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.442115    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.603673    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:22.923918    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:23.565987    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:24.847260    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:27.407483    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:32.529007    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:36:42.769860    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p embed-certs-732000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4: exit status 80 (6m49.190696741s)

-- stdout --
	* [embed-certs-732000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting control plane node embed-certs-732000 in cluster embed-certs-732000
	* Restarting existing hyperkit VM for "embed-certs-732000" ...
	* Preparing Kubernetes v1.28.4 on Docker 24.0.7 ...
	* Configuring bridge CNI (Container Networking Interface) ...
	* Verifying Kubernetes components...
	  - Using image fake.domain/registry.k8s.io/echoserver:1.4
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	  - Using image docker.io/kubernetesui/dashboard:v2.7.0
	  - Using image registry.k8s.io/echoserver:1.4
	* Some dashboard features require the metrics-server addon. To enable all features please run:
	
		minikube -p embed-certs-732000 addons enable metrics-server	
	
	
	* Enabled addons: storage-provisioner, metrics-server, default-storageclass, dashboard
	
	

-- /stdout --
** stderr ** 
	I1218 15:32:50.144685    7817 out.go:296] Setting OutFile to fd 1 ...
	I1218 15:32:50.144967    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:32:50.144973    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:32:50.144978    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:32:50.145165    7817 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 15:32:50.146521    7817 out.go:303] Setting JSON to false
	I1218 15:32:50.169032    7817 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":3741,"bootTime":1702938629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 15:32:50.169123    7817 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 15:32:50.190847    7817 out.go:177] * [embed-certs-732000] minikube v1.32.0 on Darwin 14.2
	I1218 15:32:50.232473    7817 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 15:32:50.232520    7817 notify.go:220] Checking for updates...
	I1218 15:32:50.254528    7817 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:32:50.275767    7817 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 15:32:50.296535    7817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 15:32:50.317516    7817 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:32:50.338591    7817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 15:32:50.360425    7817 config.go:182] Loaded profile config "embed-certs-732000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 15:32:50.361092    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:32:50.361172    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:32:50.370370    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56552
	I1218 15:32:50.370730    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:32:50.371177    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:32:50.371194    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:32:50.371405    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:32:50.371514    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:32:50.371691    7817 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 15:32:50.371919    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:32:50.371938    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:32:50.379680    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56554
	I1218 15:32:50.380007    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:32:50.380390    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:32:50.380408    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:32:50.380623    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:32:50.380726    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:32:50.409296    7817 out.go:177] * Using the hyperkit driver based on existing profile
	I1218 15:32:50.451438    7817 start.go:298] selected driver: hyperkit
	I1218 15:32:50.451466    7817 start.go:902] validating driver "hyperkit" against &{Name:embed-certs-732000 KeepContext:false EmbedCerts:true MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.4 ClusterName:embed-certs-732000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.169.0.43 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPo
rts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:32:50.451665    7817 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 15:32:50.455862    7817 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:32:50.455961    7817 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 15:32:50.463671    7817 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 15:32:50.467577    7817 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:32:50.467605    7817 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 15:32:50.467751    7817 start_flags.go:931] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 15:32:50.467821    7817 cni.go:84] Creating CNI manager for ""
	I1218 15:32:50.467833    7817 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:32:50.467847    7817 start_flags.go:323] config:
	{Name:embed-certs-732000 KeepContext:false EmbedCerts:true MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:embed-certs-732000 Namespace:defau
lt APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.169.0.43 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertE
xpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:32:50.467976    7817 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:32:50.510449    7817 out.go:177] * Starting control plane node embed-certs-732000 in cluster embed-certs-732000
	I1218 15:32:50.531742    7817 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 15:32:50.531803    7817 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I1218 15:32:50.531830    7817 cache.go:56] Caching tarball of preloaded images
	I1218 15:32:50.531979    7817 preload.go:174] Found /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1218 15:32:50.532028    7817 cache.go:59] Finished verifying existence of preloaded tar for  v1.28.4 on docker
	I1218 15:32:50.532182    7817 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/config.json ...
	I1218 15:32:50.532826    7817 start.go:365] acquiring machines lock for embed-certs-732000: {Name:mk129da0b7e14236047c6f70b7fc622a9cc1d994 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1218 15:32:50.532921    7817 start.go:369] acquired machines lock for "embed-certs-732000" in 71.601µs
	I1218 15:32:50.532946    7817 start.go:96] Skipping create...Using existing machine configuration
	I1218 15:32:50.532958    7817 fix.go:54] fixHost starting: 
	I1218 15:32:50.533267    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:32:50.533295    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:32:50.542200    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56556
	I1218 15:32:50.542585    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:32:50.542961    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:32:50.542971    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:32:50.543187    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:32:50.543298    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:32:50.543398    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:32:50.543498    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:32:50.543560    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7763
	I1218 15:32:50.544509    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid 7763 missing from process table
	I1218 15:32:50.544546    7817 fix.go:102] recreateIfNeeded on embed-certs-732000: state=Stopped err=<nil>
	I1218 15:32:50.544567    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	W1218 15:32:50.544653    7817 fix.go:128] unexpected machine state, will restart: <nil>
	I1218 15:32:50.586703    7817 out.go:177] * Restarting existing hyperkit VM for "embed-certs-732000" ...
	I1218 15:32:50.609582    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Start
	I1218 15:32:50.609892    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:32:50.609988    7817 main.go:141] libmachine: (embed-certs-732000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/hyperkit.pid
	I1218 15:32:50.611877    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid 7763 missing from process table
	I1218 15:32:50.611905    7817 main.go:141] libmachine: (embed-certs-732000) DBG | pid 7763 is in state "Stopped"
	I1218 15:32:50.611949    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/hyperkit.pid...
	I1218 15:32:50.612116    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Using UUID 9928e388-9dfd-11ee-b9a2-f01898ef957c
	I1218 15:32:50.639724    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Generated MAC ae:ea:f0:6a:c1:e9
	I1218 15:32:50.639748    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=embed-certs-732000
	I1218 15:32:50.639925    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9928e388-9dfd-11ee-b9a2-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000464d50)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os
.Process)(nil)}
	I1218 15:32:50.639993    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9928e388-9dfd-11ee-b9a2-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000464d50)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os
.Process)(nil)}
	I1218 15:32:50.640047    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "9928e388-9dfd-11ee-b9a2-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/embed-certs-732000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/bzimage,/Users/jenkins/minikube-integr
ation/17822-999/.minikube/machines/embed-certs-732000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=embed-certs-732000"}
	I1218 15:32:50.640081    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 9928e388-9dfd-11ee-b9a2-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/embed-certs-732000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/console-ring -f kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/initrd,e
arlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=embed-certs-732000"
	I1218 15:32:50.640095    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1218 15:32:50.641502    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 DEBUG: hyperkit: Pid is 7828
	I1218 15:32:50.641862    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Attempt 0
	I1218 15:32:50.641881    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:32:50.641919    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:32:50.643595    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Searching for ae:ea:f0:6a:c1:e9 in /var/db/dhcpd_leases ...
	I1218 15:32:50.643688    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:32:50.643711    7817 main.go:141] libmachine: (embed-certs-732000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x658227e6}
	I1218 15:32:50.643725    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Found match: ae:ea:f0:6a:c1:e9
	I1218 15:32:50.643735    7817 main.go:141] libmachine: (embed-certs-732000) DBG | IP: 192.169.0.43
	I1218 15:32:50.643789    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetConfigRaw
	I1218 15:32:50.644453    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetIP
	I1218 15:32:50.644606    7817 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/config.json ...
	I1218 15:32:50.644966    7817 machine.go:88] provisioning docker machine ...
	I1218 15:32:50.644977    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:32:50.645083    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetMachineName
	I1218 15:32:50.645201    7817 buildroot.go:166] provisioning hostname "embed-certs-732000"
	I1218 15:32:50.645211    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetMachineName
	I1218 15:32:50.645332    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:32:50.645427    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:32:50.645567    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:32:50.645687    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:32:50.645792    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:32:50.645917    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:32:50.646227    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:32:50.646239    7817 main.go:141] libmachine: About to run SSH command:
	sudo hostname embed-certs-732000 && echo "embed-certs-732000" | sudo tee /etc/hostname
	I1218 15:32:50.649715    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1218 15:32:50.659066    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1218 15:32:50.660019    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:32:50.660042    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:32:50.660100    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:32:50.660122    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:32:51.030098    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1218 15:32:51.030115    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1218 15:32:51.134224    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:32:51.134246    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:32:51.134257    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:32:51.134272    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:32:51.135107    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1218 15:32:51.135118    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:51 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1218 15:32:56.060143    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1218 15:32:56.060161    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1218 15:32:56.060171    7817 main.go:141] libmachine: (embed-certs-732000) DBG | 2023/12/18 15:32:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1218 15:33:03.837470    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: embed-certs-732000
	
	I1218 15:33:03.837497    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:03.837643    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:03.837759    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:03.837844    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:03.837918    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:03.838056    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:33:03.838318    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:33:03.838330    7817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sembed-certs-732000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 embed-certs-732000/g' /etc/hosts;
				else 
					echo '127.0.1.1 embed-certs-732000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 15:33:03.913738    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I1218 15:33:03.913758    7817 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/17822-999/.minikube CaCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/17822-999/.minikube}
	I1218 15:33:03.913773    7817 buildroot.go:174] setting up certificates
	I1218 15:33:03.913785    7817 provision.go:83] configureAuth start
	I1218 15:33:03.913793    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetMachineName
	I1218 15:33:03.913926    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetIP
	I1218 15:33:03.914019    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:03.914106    7817 provision.go:138] copyHostCerts
	I1218 15:33:03.914194    7817 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem, removing ...
	I1218 15:33:03.914203    7817 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem
	I1218 15:33:03.914346    7817 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/ca.pem (1082 bytes)
	I1218 15:33:03.914575    7817 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem, removing ...
	I1218 15:33:03.914586    7817 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem
	I1218 15:33:03.914666    7817 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/cert.pem (1123 bytes)
	I1218 15:33:03.914838    7817 exec_runner.go:144] found /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem, removing ...
	I1218 15:33:03.914844    7817 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem
	I1218 15:33:03.914920    7817 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/17822-999/.minikube/key.pem (1675 bytes)
	I1218 15:33:03.915068    7817 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca-key.pem org=jenkins.embed-certs-732000 san=[192.169.0.43 192.169.0.43 localhost 127.0.0.1 minikube embed-certs-732000]
	I1218 15:33:04.043499    7817 provision.go:172] copyRemoteCerts
	I1218 15:33:04.043560    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 15:33:04.043578    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.043713    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.043874    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.043975    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.044060    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:04.085940    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1218 15:33:04.101916    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server.pem --> /etc/docker/server.pem (1233 bytes)
	I1218 15:33:04.117730    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1218 15:33:04.133541    7817 provision.go:86] duration metric: configureAuth took 219.738294ms
	I1218 15:33:04.133557    7817 buildroot.go:189] setting minikube options for container-runtime
	I1218 15:33:04.133684    7817 config.go:182] Loaded profile config "embed-certs-732000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 15:33:04.133697    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:04.133830    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.133938    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.134025    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.134114    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.134212    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.134329    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:33:04.134571    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:33:04.134585    7817 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I1218 15:33:04.207113    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I1218 15:33:04.207124    7817 buildroot.go:70] root file system type: tmpfs
	I1218 15:33:04.207211    7817 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I1218 15:33:04.207228    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.207382    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.207475    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.207569    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.207663    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.207787    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:33:04.208041    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:33:04.208088    7817 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I1218 15:33:04.288039    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I1218 15:33:04.288059    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.288193    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.288287    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.288371    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.288463    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.288584    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:33:04.288831    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:33:04.288844    7817 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I1218 15:33:04.883661    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I1218 15:33:04.883678    7817 machine.go:91] provisioned docker machine in 14.238477747s
	I1218 15:33:04.883693    7817 start.go:300] post-start starting for "embed-certs-732000" (driver="hyperkit")
	I1218 15:33:04.883704    7817 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 15:33:04.883715    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:04.883913    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 15:33:04.883927    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.884025    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.884140    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.884252    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.884358    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:04.924051    7817 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 15:33:04.926727    7817 info.go:137] Remote host: Buildroot 2021.02.12
	I1218 15:33:04.926748    7817 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17822-999/.minikube/addons for local assets ...
	I1218 15:33:04.926850    7817 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17822-999/.minikube/files for local assets ...
	I1218 15:33:04.927024    7817 filesync.go:149] local asset: /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I1218 15:33:04.927218    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 15:33:04.933137    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I1218 15:33:04.949306    7817 start.go:303] post-start completed in 65.603559ms
	I1218 15:33:04.949322    7817 fix.go:56] fixHost completed within 14.416140426s
	I1218 15:33:04.949357    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:04.949492    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:04.949590    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.949681    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:04.949770    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:04.949886    7817 main.go:141] libmachine: Using SSH client type: native
	I1218 15:33:04.950129    7817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.43 22 <nil> <nil>}
	I1218 15:33:04.950137    7817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I1218 15:33:05.019371    7817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1702942385.033686142
	
	I1218 15:33:05.019388    7817 fix.go:206] guest clock: 1702942385.033686142
	I1218 15:33:05.019393    7817 fix.go:219] Guest: 2023-12-18 15:33:05.033686142 -0800 PST Remote: 2023-12-18 15:33:04.949325 -0800 PST m=+14.848927169 (delta=84.361142ms)
	I1218 15:33:05.019414    7817 fix.go:190] guest clock delta is within tolerance: 84.361142ms
	I1218 15:33:05.019420    7817 start.go:83] releasing machines lock for "embed-certs-732000", held for 14.486258402s
	I1218 15:33:05.019441    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:05.019584    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetIP
	I1218 15:33:05.019689    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:05.019987    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:05.020102    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:05.020170    7817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 15:33:05.020209    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:05.020240    7817 ssh_runner.go:195] Run: cat /version.json
	I1218 15:33:05.020270    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:05.020310    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:05.020352    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:05.020405    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:05.020469    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:05.020499    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:05.020614    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:05.020621    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:05.020734    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:05.057231    7817 ssh_runner.go:195] Run: systemctl --version
	I1218 15:33:05.061357    7817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 15:33:05.111094    7817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 15:33:05.111175    7817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 15:33:05.123015    7817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I1218 15:33:05.123027    7817 start.go:475] detecting cgroup driver to use...
	I1218 15:33:05.123142    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 15:33:05.134912    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I1218 15:33:05.142021    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1218 15:33:05.149147    7817 containerd.go:145] configuring containerd to use "cgroupfs" as cgroup driver...
	I1218 15:33:05.166534    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1218 15:33:05.175333    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1218 15:33:05.182541    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1218 15:33:05.189440    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1218 15:33:05.196625    7817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 15:33:05.204150    7817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1218 15:33:05.211776    7817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 15:33:05.218166    7817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 15:33:05.224441    7817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:33:05.314139    7817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1218 15:33:05.326641    7817 start.go:475] detecting cgroup driver to use...
	I1218 15:33:05.326716    7817 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I1218 15:33:05.335653    7817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 15:33:05.345965    7817 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 15:33:05.358109    7817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 15:33:05.368075    7817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1218 15:33:05.376325    7817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1218 15:33:05.398424    7817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1218 15:33:05.407808    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 15:33:05.420049    7817 ssh_runner.go:195] Run: which cri-dockerd
	I1218 15:33:05.422547    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I1218 15:33:05.428950    7817 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I1218 15:33:05.439884    7817 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I1218 15:33:05.522731    7817 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I1218 15:33:05.605815    7817 docker.go:560] configuring docker to use "cgroupfs" as cgroup driver...
	I1218 15:33:05.605899    7817 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I1218 15:33:05.617229    7817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:33:05.703311    7817 ssh_runner.go:195] Run: sudo systemctl restart docker
	I1218 15:33:06.993895    7817 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.290545229s)
	I1218 15:33:06.993961    7817 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I1218 15:33:07.091685    7817 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I1218 15:33:07.174234    7817 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I1218 15:33:07.256345    7817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:33:07.351352    7817 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I1218 15:33:07.366641    7817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 15:33:07.466368    7817 ssh_runner.go:195] Run: sudo systemctl restart cri-docker
	I1218 15:33:07.520640    7817 start.go:522] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I1218 15:33:07.520717    7817 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I1218 15:33:07.524511    7817 start.go:543] Will wait 60s for crictl version
	I1218 15:33:07.524564    7817 ssh_runner.go:195] Run: which crictl
	I1218 15:33:07.527452    7817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I1218 15:33:07.566604    7817 start.go:559] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  24.0.7
	RuntimeApiVersion:  v1
	I1218 15:33:07.566677    7817 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I1218 15:33:07.585315    7817 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I1218 15:33:07.645568    7817 out.go:204] * Preparing Kubernetes v1.28.4 on Docker 24.0.7 ...
	I1218 15:33:07.645618    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetIP
	I1218 15:33:07.646005    7817 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I1218 15:33:07.650345    7817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 15:33:07.659370    7817 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 15:33:07.659435    7817 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1218 15:33:07.672736    7817 docker.go:671] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.28.4
	registry.k8s.io/kube-controller-manager:v1.28.4
	registry.k8s.io/kube-scheduler:v1.28.4
	registry.k8s.io/kube-proxy:v1.28.4
	registry.k8s.io/etcd:3.5.9-0
	registry.k8s.io/coredns/coredns:v1.10.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28.4-glibc
	
	-- /stdout --
	I1218 15:33:07.672754    7817 docker.go:601] Images already preloaded, skipping extraction
	I1218 15:33:07.672827    7817 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1218 15:33:07.686195    7817 docker.go:671] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.28.4
	registry.k8s.io/kube-controller-manager:v1.28.4
	registry.k8s.io/kube-scheduler:v1.28.4
	registry.k8s.io/kube-proxy:v1.28.4
	registry.k8s.io/etcd:3.5.9-0
	registry.k8s.io/coredns/coredns:v1.10.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28.4-glibc
	
	-- /stdout --
	I1218 15:33:07.686219    7817 cache_images.go:84] Images are preloaded, skipping loading
	I1218 15:33:07.686291    7817 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I1218 15:33:07.703391    7817 cni.go:84] Creating CNI manager for ""
	I1218 15:33:07.703410    7817 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:33:07.703429    7817 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I1218 15:33:07.703465    7817 kubeadm.go:176] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.43 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:embed-certs-732000 NodeName:embed-certs-732000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.43"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.43 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 15:33:07.703547    7817 kubeadm.go:181] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.43
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "embed-certs-732000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.43
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.43"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 15:33:07.703603    7817 kubeadm.go:976] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime-endpoint=unix:///var/run/cri-dockerd.sock --hostname-override=embed-certs-732000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.43
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:embed-certs-732000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I1218 15:33:07.703660    7817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I1218 15:33:07.709611    7817 binaries.go:44] Found k8s binaries, skipping transfer
	I1218 15:33:07.709668    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 15:33:07.715298    7817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (379 bytes)
	I1218 15:33:07.726460    7817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 15:33:07.738028    7817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2101 bytes)
	I1218 15:33:07.749564    7817 ssh_runner.go:195] Run: grep 192.169.0.43	control-plane.minikube.internal$ /etc/hosts
	I1218 15:33:07.751909    7817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.43	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 15:33:07.760181    7817 certs.go:56] Setting up /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000 for IP: 192.169.0.43
	I1218 15:33:07.760200    7817 certs.go:190] acquiring lock for shared ca certs: {Name:mk7279cfb00b11a2a248ef485e6eb44917fceabd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:33:07.760350    7817 certs.go:199] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/17822-999/.minikube/ca.key
	I1218 15:33:07.760402    7817 certs.go:199] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/17822-999/.minikube/proxy-client-ca.key
	I1218 15:33:07.760498    7817 certs.go:315] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/client.key
	I1218 15:33:07.760570    7817 certs.go:315] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/apiserver.key.ec188bd5
	I1218 15:33:07.760616    7817 certs.go:315] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/proxy-client.key
	I1218 15:33:07.760830    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/1483.pem (1338 bytes)
	W1218 15:33:07.760864    7817 certs.go:433] ignoring /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I1218 15:33:07.760874    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 15:33:07.760908    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem (1082 bytes)
	I1218 15:33:07.760940    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem (1123 bytes)
	I1218 15:33:07.760970    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/certs/Users/jenkins/minikube-integration/17822-999/.minikube/certs/key.pem (1675 bytes)
	I1218 15:33:07.761033    7817 certs.go:437] found cert: /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I1218 15:33:07.761519    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I1218 15:33:07.777646    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 15:33:07.793601    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 15:33:07.809905    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/embed-certs-732000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 15:33:07.826194    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 15:33:07.842725    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 15:33:07.858814    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 15:33:07.874859    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 15:33:07.890885    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I1218 15:33:07.907060    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I1218 15:33:07.922746    7817 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17822-999/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 15:33:07.939002    7817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 15:33:07.949968    7817 ssh_runner.go:195] Run: openssl version
	I1218 15:33:07.953620    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I1218 15:33:07.960070    7817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I1218 15:33:07.962875    7817 certs.go:480] hashing: -rw-r--r-- 1 root root 1338 Dec 18 22:42 /usr/share/ca-certificates/1483.pem
	I1218 15:33:07.962914    7817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I1218 15:33:07.966301    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I1218 15:33:07.972530    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I1218 15:33:07.978971    7817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I1218 15:33:07.981952    7817 certs.go:480] hashing: -rw-r--r-- 1 root root 1708 Dec 18 22:42 /usr/share/ca-certificates/14832.pem
	I1218 15:33:07.981989    7817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I1218 15:33:07.985475    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I1218 15:33:07.991788    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1218 15:33:07.998262    7817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 15:33:08.001187    7817 certs.go:480] hashing: -rw-r--r-- 1 root root 1111 Dec 18 22:38 /usr/share/ca-certificates/minikubeCA.pem
	I1218 15:33:08.001220    7817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 15:33:08.004704    7817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1218 15:33:08.011103    7817 ssh_runner.go:195] Run: ls /var/lib/minikube/certs/etcd
	I1218 15:33:08.013795    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 15:33:08.017391    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 15:33:08.020918    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 15:33:08.024454    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 15:33:08.028028    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 15:33:08.031521    7817 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 15:33:08.035054    7817 kubeadm.go:404] StartCluster: {Name:embed-certs-732000 KeepContext:false EmbedCerts:true MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:embed-certs-732000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.169.0.43 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true default-storageclass:true metrics-server:true storage-provisioner:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4 MetricsServer:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[MetricsServer:fake.domain] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:33:08.035145    7817 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I1218 15:33:08.048464    7817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 15:33:08.054646    7817 kubeadm.go:419] found existing configuration files, will attempt cluster restart
	I1218 15:33:08.054663    7817 kubeadm.go:636] restartCluster start
	I1218 15:33:08.054719    7817 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 15:33:08.060611    7817 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:08.060982    7817 kubeconfig.go:135] verify returned: extract IP: "embed-certs-732000" does not appear in /Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:33:08.061120    7817 kubeconfig.go:146] "embed-certs-732000" context is missing from /Users/jenkins/minikube-integration/17822-999/kubeconfig - will repair!
	I1218 15:33:08.061423    7817 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17822-999/kubeconfig: {Name:mk8154cfbf2a2bcb7be4f33617ae805aa580a4da Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:33:08.062842    7817 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 15:33:08.068381    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:08.068425    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:08.076377    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:08.568835    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:08.569062    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:08.578503    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:09.068491    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:09.068550    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:09.076170    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:09.568995    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:09.569253    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:09.578325    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:10.070483    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:10.070642    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:10.079905    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:10.569603    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:10.569775    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:10.579427    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:11.068720    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:11.068807    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:11.078346    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:11.568502    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:11.568598    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:11.576335    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:12.068989    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:12.069081    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:12.076862    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:12.568555    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:12.568616    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:12.576833    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:13.068561    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:13.068615    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:13.076618    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:13.568640    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:13.568826    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:13.577772    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:14.069452    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:14.069588    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:14.079063    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:14.568591    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:14.568672    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:14.576553    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:15.069684    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:15.069873    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:15.079082    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:15.568714    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:15.568826    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:15.577655    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:16.069013    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:16.069177    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:16.078485    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:16.568653    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:16.568763    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:16.578088    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:17.069386    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:17.069606    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:17.078308    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:17.570183    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:17.570320    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:17.579728    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:18.069532    7817 api_server.go:166] Checking apiserver status ...
	I1218 15:33:18.069641    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1218 15:33:18.077834    7817 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1218 15:33:18.077850    7817 kubeadm.go:611] needs reconfigure: apiserver error: context deadline exceeded
	I1218 15:33:18.077861    7817 kubeadm.go:1135] stopping kube-system containers ...
	I1218 15:33:18.077940    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I1218 15:33:18.092111    7817 docker.go:469] Stopping containers: [bae4df0eea00 a6d450994cb2 af46bb40c64a e5bd7d5d2abc 1385b9130236 c5f46b946942 d1aea88f8ab7 c1c34392f2c4 8ed2d9614b64 d7a131de94f2 c4ac6546cfdf d4c694d47b34 bd2e3e1fa85f c8fde86d654f 479488e960e3]
	I1218 15:33:18.092195    7817 ssh_runner.go:195] Run: docker stop bae4df0eea00 a6d450994cb2 af46bb40c64a e5bd7d5d2abc 1385b9130236 c5f46b946942 d1aea88f8ab7 c1c34392f2c4 8ed2d9614b64 d7a131de94f2 c4ac6546cfdf d4c694d47b34 bd2e3e1fa85f c8fde86d654f 479488e960e3
	I1218 15:33:18.106137    7817 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 15:33:18.116381    7817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 15:33:18.122386    7817 kubeadm.go:152] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 15:33:18.122434    7817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 15:33:18.128163    7817 kubeadm.go:713] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I1218 15:33:18.128173    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:18.200803    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:19.060559    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:19.207080    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:19.268643    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:19.324200    7817 api_server.go:52] waiting for apiserver process to appear ...
	I1218 15:33:19.324265    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 15:33:19.824854    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 15:33:20.324914    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 15:33:20.824611    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 15:33:20.844505    7817 api_server.go:72] duration metric: took 1.520282057s to wait for apiserver process to appear ...
	I1218 15:33:20.844518    7817 api_server.go:88] waiting for apiserver healthz status ...
	I1218 15:33:20.844529    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:33:23.611728    7817 api_server.go:279] https://192.169.0.43:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W1218 15:33:23.611744    7817 api_server.go:103] status: https://192.169.0.43:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I1218 15:33:23.611753    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:33:23.691176    7817 api_server.go:279] https://192.169.0.43:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	W1218 15:33:23.691193    7817 api_server.go:103] status: https://192.169.0.43:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	I1218 15:33:23.845387    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:33:23.848838    7817 api_server.go:279] https://192.169.0.43:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	W1218 15:33:23.848850    7817 api_server.go:103] status: https://192.169.0.43:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	I1218 15:33:24.345977    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:33:24.350688    7817 api_server.go:279] https://192.169.0.43:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	W1218 15:33:24.350706    7817 api_server.go:103] status: https://192.169.0.43:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-deprecated-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	healthz check failed
	I1218 15:33:24.845377    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:33:24.848836    7817 api_server.go:279] https://192.169.0.43:8443/healthz returned 200:
	ok
	I1218 15:33:24.854170    7817 api_server.go:141] control plane version: v1.28.4
	I1218 15:33:24.854183    7817 api_server.go:131] duration metric: took 4.009597163s to wait for apiserver health ...
	I1218 15:33:24.854189    7817 cni.go:84] Creating CNI manager for ""
	I1218 15:33:24.854199    7817 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:33:24.879081    7817 out.go:177] * Configuring bridge CNI (Container Networking Interface) ...
	I1218 15:33:24.898873    7817 ssh_runner.go:195] Run: sudo mkdir -p /etc/cni/net.d
	I1218 15:33:24.908998    7817 ssh_runner.go:362] scp memory --> /etc/cni/net.d/1-k8s.conflist (457 bytes)
	I1218 15:33:24.954601    7817 system_pods.go:43] waiting for kube-system pods to appear ...
	I1218 15:33:24.961079    7817 system_pods.go:59] 8 kube-system pods found
	I1218 15:33:24.961096    7817 system_pods.go:61] "coredns-5dd5756b68-pmdqr" [c5cc2159-f5d4-4cd1-ba35-a833725d5564] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 15:33:24.961101    7817 system_pods.go:61] "etcd-embed-certs-732000" [992043fb-8799-4452-89e6-54e852cfb476] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1218 15:33:24.961107    7817 system_pods.go:61] "kube-apiserver-embed-certs-732000" [1fac423a-1817-486d-906e-4e825f18c074] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1218 15:33:24.961116    7817 system_pods.go:61] "kube-controller-manager-embed-certs-732000" [6b87c883-a859-4fe9-a87f-a18a9ca3d716] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1218 15:33:24.961124    7817 system_pods.go:61] "kube-proxy-dc8tp" [7536a37b-0221-497a-9e6a-fec81b93b7d3] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I1218 15:33:24.961129    7817 system_pods.go:61] "kube-scheduler-embed-certs-732000" [124a521e-91ef-4057-a111-8deb2acbb101] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1218 15:33:24.961134    7817 system_pods.go:61] "metrics-server-57f55c9bc5-dlnhj" [49b4822d-509b-4b6d-8b84-53a07c312705] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 15:33:24.961139    7817 system_pods.go:61] "storage-provisioner" [32e980ce-bc09-461b-92c8-79caa652901a] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 15:33:24.961144    7817 system_pods.go:74] duration metric: took 6.531819ms to wait for pod list to return data ...
	I1218 15:33:24.961152    7817 node_conditions.go:102] verifying NodePressure condition ...
	I1218 15:33:24.963659    7817 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I1218 15:33:24.963678    7817 node_conditions.go:123] node cpu capacity is 2
	I1218 15:33:24.963689    7817 node_conditions.go:105] duration metric: took 2.533372ms to run NodePressure ...
	I1218 15:33:24.963710    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 15:33:25.200776    7817 kubeadm.go:772] waiting for restarted kubelet to initialise ...
	I1218 15:33:25.204640    7817 kubeadm.go:787] kubelet initialised
	I1218 15:33:25.204654    7817 kubeadm.go:788] duration metric: took 3.862459ms waiting for restarted kubelet to initialise ...
	I1218 15:33:25.204661    7817 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1218 15:33:25.209556    7817 pod_ready.go:78] waiting up to 4m0s for pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:25.214121    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.214140    7817 pod_ready.go:81] duration metric: took 4.568449ms waiting for pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:25.214148    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.214156    7817 pod_ready.go:78] waiting up to 4m0s for pod "etcd-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:25.218529    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "etcd-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.218542    7817 pod_ready.go:81] duration metric: took 4.379557ms waiting for pod "etcd-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:25.218548    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "etcd-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.218553    7817 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:25.224002    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.224015    7817 pod_ready.go:81] duration metric: took 5.454637ms waiting for pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:25.224022    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.224027    7817 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:25.357062    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.357076    7817 pod_ready.go:81] duration metric: took 133.025725ms waiting for pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:25.357083    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.357088    7817 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-dc8tp" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:25.758311    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "kube-proxy-dc8tp" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.758324    7817 pod_ready.go:81] duration metric: took 401.22441ms waiting for pod "kube-proxy-dc8tp" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:25.758331    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "kube-proxy-dc8tp" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:25.758343    7817 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:26.157742    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:26.157756    7817 pod_ready.go:81] duration metric: took 399.401347ms waiting for pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:26.157763    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:26.157768    7817 pod_ready.go:78] waiting up to 4m0s for pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:26.559369    7817 pod_ready.go:97] node "embed-certs-732000" hosting pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:26.559386    7817 pod_ready.go:81] duration metric: took 401.602706ms waiting for pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace to be "Ready" ...
	E1218 15:33:26.559395    7817 pod_ready.go:66] WaitExtra: waitPodCondition: node "embed-certs-732000" hosting pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace is currently not "Ready" (skipping!): node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:26.559411    7817 pod_ready.go:38] duration metric: took 1.354720099s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1218 15:33:26.559431    7817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1218 15:33:26.568450    7817 ops.go:34] apiserver oom_adj: -16
	I1218 15:33:26.568460    7817 kubeadm.go:640] restartCluster took 18.513497926s
	I1218 15:33:26.568465    7817 kubeadm.go:406] StartCluster complete in 18.533125287s
	I1218 15:33:26.568474    7817 settings.go:142] acquiring lock: {Name:mk74e01bdc7838c9a52e5871268057daa99735d4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:33:26.568544    7817 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:33:26.569328    7817 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17822-999/kubeconfig: {Name:mk8154cfbf2a2bcb7be4f33617ae805aa580a4da Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:33:26.569598    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1218 15:33:26.569652    7817 addons.go:499] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:true default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false]
	I1218 15:33:26.569693    7817 addons.go:69] Setting storage-provisioner=true in profile "embed-certs-732000"
	I1218 15:33:26.569709    7817 addons.go:231] Setting addon storage-provisioner=true in "embed-certs-732000"
	I1218 15:33:26.569709    7817 addons.go:69] Setting default-storageclass=true in profile "embed-certs-732000"
	W1218 15:33:26.569715    7817 addons.go:240] addon storage-provisioner should already be in state true
	I1218 15:33:26.569724    7817 addons.go:69] Setting metrics-server=true in profile "embed-certs-732000"
	I1218 15:33:26.569737    7817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "embed-certs-732000"
	I1218 15:33:26.569738    7817 config.go:182] Loaded profile config "embed-certs-732000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 15:33:26.569759    7817 addons.go:231] Setting addon metrics-server=true in "embed-certs-732000"
	I1218 15:33:26.569762    7817 host.go:66] Checking if "embed-certs-732000" exists ...
	I1218 15:33:26.569740    7817 addons.go:69] Setting dashboard=true in profile "embed-certs-732000"
	W1218 15:33:26.569772    7817 addons.go:240] addon metrics-server should already be in state true
	I1218 15:33:26.569787    7817 addons.go:231] Setting addon dashboard=true in "embed-certs-732000"
	W1218 15:33:26.569809    7817 addons.go:240] addon dashboard should already be in state true
	I1218 15:33:26.569828    7817 host.go:66] Checking if "embed-certs-732000" exists ...
	I1218 15:33:26.569878    7817 host.go:66] Checking if "embed-certs-732000" exists ...
	I1218 15:33:26.570003    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.570015    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.570028    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.570033    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.570128    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.570163    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.570162    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.570665    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.579813    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56582
	I1218 15:33:26.579872    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56583
	I1218 15:33:26.582076    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.582181    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.582264    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56586
	I1218 15:33:26.582529    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.582541    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.582634    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.582641    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.582672    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.582808    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.582942    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.582991    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:33:26.583020    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.583036    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.583116    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:33:26.583207    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:33:26.583293    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.583329    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56588
	I1218 15:33:26.583388    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.583415    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.584233    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.584520    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.584610    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.586668    7817 kapi.go:248] "coredns" deployment in "kube-system" namespace and "embed-certs-732000" context rescaled to 1 replicas
	I1218 15:33:26.586668    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.586784    7817 start.go:223] Will wait 6m0s for node &{Name: IP:192.169.0.43 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1218 15:33:26.586840    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.607786    7817 out.go:177] * Verifying Kubernetes components...
	I1218 15:33:26.587915    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.588074    7817 addons.go:231] Setting addon default-storageclass=true in "embed-certs-732000"
	W1218 15:33:26.649341    7817 addons.go:240] addon default-storageclass should already be in state true
	I1218 15:33:26.649363    7817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 15:33:26.649367    7817 host.go:66] Checking if "embed-certs-732000" exists ...
	I1218 15:33:26.592949    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56590
	I1218 15:33:26.593741    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56591
	I1218 15:33:26.608252    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.649490    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.649705    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.650321    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.650483    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.650627    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.653228    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.653269    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.653387    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.653407    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.653574    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.653677    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.653695    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:33:26.653807    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:33:26.653809    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:33:26.653911    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:33:26.653923    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:33:26.653977    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:33:26.655004    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:26.655127    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:26.676525    7817 out.go:177]   - Using image fake.domain/registry.k8s.io/echoserver:1.4
	I1218 15:33:26.697524    7817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 15:33:26.659671    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56595
	I1218 15:33:26.659688    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56594
	I1218 15:33:26.697944    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.718444    7817 addons.go:423] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 15:33:26.739563    7817 addons.go:423] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I1218 15:33:26.739566    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 15:33:26.739574    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I1218 15:33:26.739583    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:26.739583    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:26.730728    7817 node_ready.go:35] waiting up to 6m0s for node "embed-certs-732000" to be "Ready" ...
	I1218 15:33:26.730823    7817 start.go:902] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I1218 15:33:26.718787    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.739727    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:26.739744    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:26.739850    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:26.739869    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:26.739961    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:26.739971    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:26.739995    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.740013    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.740058    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.740073    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.740072    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:26.740079    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:26.740268    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.740295    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.740385    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:33:26.740476    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:33:26.740540    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:33:26.740663    7817 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:33:26.740677    7817 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:33:26.742101    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:26.763518    7817 out.go:177]   - Using image docker.io/kubernetesui/dashboard:v2.7.0
	I1218 15:33:26.749006    7817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56600
	I1218 15:33:26.763895    7817 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:33:26.790242    7817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 15:33:26.790607    7817 addons.go:423] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I1218 15:33:26.804523    7817 out.go:177]   - Using image registry.k8s.io/echoserver:1.4
	I1218 15:33:26.804530    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1825 bytes)
	I1218 15:33:26.804968    7817 main.go:141] libmachine: Using API Version  1
	I1218 15:33:26.825566    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-ns.yaml
	I1218 15:33:26.817178    7817 addons.go:423] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I1218 15:33:26.825576    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-ns.yaml (759 bytes)
	I1218 15:33:26.825582    7817 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:33:26.825590    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I1218 15:33:26.825599    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:26.825796    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:26.825921    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:26.825934    7817 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:33:26.826076    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:26.826092    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetState
	I1218 15:33:26.826194    7817 main.go:141] libmachine: (embed-certs-732000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:33:26.826195    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:26.826283    7817 main.go:141] libmachine: (embed-certs-732000) DBG | hyperkit pid from json: 7828
	I1218 15:33:26.827399    7817 main.go:141] libmachine: (embed-certs-732000) Calling .DriverName
	I1218 15:33:26.827578    7817 addons.go:423] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 15:33:26.827586    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 15:33:26.827597    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHHostname
	I1218 15:33:26.827700    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHPort
	I1218 15:33:26.827782    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHKeyPath
	I1218 15:33:26.827860    7817 main.go:141] libmachine: (embed-certs-732000) Calling .GetSSHUsername
	I1218 15:33:26.827948    7817 sshutil.go:53] new ssh client: &{IP:192.169.0.43 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/embed-certs-732000/id_rsa Username:docker}
	I1218 15:33:26.845784    7817 addons.go:423] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 15:33:26.845796    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I1218 15:33:26.866315    7817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 15:33:26.918489    7817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 15:33:26.924457    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-clusterrole.yaml
	I1218 15:33:26.924468    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrole.yaml (1001 bytes)
	I1218 15:33:26.999410    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml
	I1218 15:33:26.999423    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml (1018 bytes)
	I1218 15:33:27.017723    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-configmap.yaml
	I1218 15:33:27.017735    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-configmap.yaml (837 bytes)
	I1218 15:33:27.046456    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-dp.yaml
	I1218 15:33:27.046474    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-dp.yaml (4201 bytes)
	I1218 15:33:27.085338    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-role.yaml
	I1218 15:33:27.085350    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-role.yaml (1724 bytes)
	I1218 15:33:27.143874    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-rolebinding.yaml
	I1218 15:33:27.143886    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-rolebinding.yaml (1046 bytes)
	I1218 15:33:27.165319    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-sa.yaml
	I1218 15:33:27.165331    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-sa.yaml (837 bytes)
	I1218 15:33:27.179128    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-secret.yaml
	I1218 15:33:27.179140    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-secret.yaml (1389 bytes)
	I1218 15:33:27.190607    7817 addons.go:423] installing /etc/kubernetes/addons/dashboard-svc.yaml
	I1218 15:33:27.190618    7817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-svc.yaml (1294 bytes)
	I1218 15:33:27.201934    7817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml
	I1218 15:33:28.057319    7817 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.252729887s)
	I1218 15:33:28.057351    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.057371    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.057556    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.057567    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.057574    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.057579    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.057583    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.057717    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.057731    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.102153    7817 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (1.183624019s)
	I1218 15:33:28.102183    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.102192    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.102237    7817 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (1.235880839s)
	I1218 15:33:28.102261    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.102272    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.102439    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.102448    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.102475    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.102481    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.102485    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.102491    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.102502    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.102504    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.102510    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.102513    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.102718    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.102735    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.102743    7817 addons.go:467] Verifying addon metrics-server=true in "embed-certs-732000"
	I1218 15:33:28.102755    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.102776    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.102792    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.108318    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.108332    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.108477    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.108489    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.108498    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.321295    7817 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml: (1.119311183s)
	I1218 15:33:28.321327    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.321341    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.321509    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.321519    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.321528    7817 main.go:141] libmachine: Making call to close driver server
	I1218 15:33:28.321535    7817 main.go:141] libmachine: (embed-certs-732000) Calling .Close
	I1218 15:33:28.321639    7817 main.go:141] libmachine: Successfully made call to close driver server
	I1218 15:33:28.321647    7817 main.go:141] libmachine: Making call to close connection to plugin binary
	I1218 15:33:28.321659    7817 main.go:141] libmachine: (embed-certs-732000) DBG | Closing plugin on server side
	I1218 15:33:28.344232    7817 out.go:177] * Some dashboard features require the metrics-server addon. To enable all features please run:
	
		minikube -p embed-certs-732000 addons enable metrics-server	
	
	
	I1218 15:33:28.401024    7817 out.go:177] * Enabled addons: storage-provisioner, metrics-server, default-storageclass, dashboard
	I1218 15:33:28.475224    7817 addons.go:502] enable addons completed in 1.905559323s: enabled=[storage-provisioner metrics-server default-storageclass dashboard]
	I1218 15:33:28.744977    7817 node_ready.go:58] node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:31.245001    7817 node_ready.go:58] node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:33.742462    7817 node_ready.go:58] node "embed-certs-732000" has status "Ready":"False"
	I1218 15:33:34.245267    7817 node_ready.go:49] node "embed-certs-732000" has status "Ready":"True"
	I1218 15:33:34.245287    7817 node_ready.go:38] duration metric: took 7.505530618s waiting for node "embed-certs-732000" to be "Ready" ...
	I1218 15:33:34.245299    7817 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1218 15:33:34.249219    7817 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:34.252328    7817 pod_ready.go:92] pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:34.252337    7817 pod_ready.go:81] duration metric: took 3.107093ms waiting for pod "coredns-5dd5756b68-pmdqr" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:34.252343    7817 pod_ready.go:78] waiting up to 6m0s for pod "etcd-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:36.259106    7817 pod_ready.go:102] pod "etcd-embed-certs-732000" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:38.757548    7817 pod_ready.go:92] pod "etcd-embed-certs-732000" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:38.757560    7817 pod_ready.go:81] duration metric: took 4.505139841s waiting for pod "etcd-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.757570    7817 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.761393    7817 pod_ready.go:92] pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:38.761402    7817 pod_ready.go:81] duration metric: took 3.827592ms waiting for pod "kube-apiserver-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.761409    7817 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.768381    7817 pod_ready.go:92] pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:38.768393    7817 pod_ready.go:81] duration metric: took 6.978696ms waiting for pod "kube-controller-manager-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.768400    7817 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-dc8tp" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.772290    7817 pod_ready.go:92] pod "kube-proxy-dc8tp" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:38.772301    7817 pod_ready.go:81] duration metric: took 3.890777ms waiting for pod "kube-proxy-dc8tp" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.772307    7817 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.776143    7817 pod_ready.go:92] pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace has status "Ready":"True"
	I1218 15:33:38.776152    7817 pod_ready.go:81] duration metric: took 3.839279ms waiting for pod "kube-scheduler-embed-certs-732000" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:38.776158    7817 pod_ready.go:78] waiting up to 6m0s for pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace to be "Ready" ...
	I1218 15:33:40.782223    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:43.281601    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:45.781784    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:47.781881    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:50.285096    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:52.781635    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:54.781936    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:56.783407    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:33:59.282169    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:01.782713    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:04.282743    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:06.283221    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:08.283831    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:10.781041    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:12.783219    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:15.281051    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:17.283420    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:19.782729    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:21.782847    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:24.283260    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:26.782795    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:29.281369    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:31.288659    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:33.783472    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:36.283320    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:38.284098    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:40.286748    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:42.287842    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:44.783934    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:47.282944    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:49.284757    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:51.782014    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:54.282435    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:56.782932    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:34:58.783090    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:01.284090    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:03.783801    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:06.282744    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:08.283683    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:10.782024    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:12.782886    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:15.284246    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:17.784269    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:20.284168    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:22.782511    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:24.783013    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:26.783168    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:29.282173    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:31.782585    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:33.784299    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:36.284373    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:38.784335    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:40.784767    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:43.284866    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:45.784134    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:47.785126    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:50.283899    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:52.284262    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:54.285438    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:56.286081    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:35:58.785587    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:01.285388    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:03.782854    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:05.785653    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:08.282830    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:10.782857    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:12.785167    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:15.282947    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:17.283714    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:19.783709    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:21.784452    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:23.785609    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:25.786338    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:28.282192    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:30.284335    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:32.784553    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:35.283196    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:37.283752    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:39.285213    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:41.783779    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:43.786157    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:46.283887    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:48.284290    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:50.784271    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:52.784890    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:55.283096    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:57.284287    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:36:59.288659    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:01.784476    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:03.785853    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:06.286408    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:08.286779    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:10.787546    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:13.284254    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:15.286698    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:17.784708    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:19.787574    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:22.283282    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:24.283399    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:26.286840    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:28.785197    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:31.285933    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:33.785433    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:35.786056    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:38.286797    7817 pod_ready.go:102] pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace has status "Ready":"False"
	I1218 15:37:38.785891    7817 pod_ready.go:81] duration metric: took 4m0.005914982s waiting for pod "metrics-server-57f55c9bc5-dlnhj" in "kube-system" namespace to be "Ready" ...
	E1218 15:37:38.785904    7817 pod_ready.go:66] WaitExtra: waitPodCondition: context deadline exceeded
	I1218 15:37:38.785909    7817 pod_ready.go:38] duration metric: took 4m4.536720988s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1218 15:37:38.785921    7817 api_server.go:52] waiting for apiserver process to appear ...
	I1218 15:37:38.786004    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:37:38.799679    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:37:38.799756    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:37:38.816812    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:37:38.816885    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:37:38.830989    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:37:38.831068    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:37:38.845474    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:37:38.845551    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:37:38.859222    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:37:38.859297    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:37:38.872741    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:37:38.872818    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:37:38.885331    7817 logs.go:284] 0 containers: []
	W1218 15:37:38.885343    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:37:38.885405    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:37:38.898967    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:37:38.899041    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:37:38.912280    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:37:38.912297    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:37:38.912305    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:37:38.940226    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:37:38.940243    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:37:38.959972    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:37:38.959986    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:37:38.976334    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:37:38.976347    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:37:38.991308    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:37:38.991321    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:37:39.000033    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:37:39.000046    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:37:39.101798    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:37:39.101814    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:37:39.125123    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:37:39.125138    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:37:39.145028    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:37:39.145041    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:37:39.171019    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:37:39.171033    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:37:39.186241    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:37:39.186255    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:37:39.201938    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:37:39.201953    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:37:39.241437    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:37:39.241451    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:37:39.256499    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:37:39.256513    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:37:39.277937    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:37:39.277952    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:37:39.292704    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:37:39.292718    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:37:39.310316    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:37:39.310331    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:37:39.356096    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:37:39.356110    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:37:39.384503    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:37:39.384628    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:37:39.400023    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:37:39.400037    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:37:39.418252    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:37:39.418266    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:37:39.446553    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:37:39.446569    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:37:39.446600    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:37:39.446607    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:37:39.446621    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:37:39.446628    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:37:39.446633    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:37:49.448922    7817 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 15:37:49.459601    7817 api_server.go:72] duration metric: took 4m22.868589993s to wait for apiserver process to appear ...
	I1218 15:37:49.459611    7817 api_server.go:88] waiting for apiserver healthz status ...
	I1218 15:37:49.459683    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:37:49.472881    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:37:49.472955    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:37:49.487287    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:37:49.487361    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:37:49.501212    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:37:49.501289    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:37:49.514486    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:37:49.514562    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:37:49.528024    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:37:49.528102    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:37:49.542144    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:37:49.542224    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:37:49.555612    7817 logs.go:284] 0 containers: []
	W1218 15:37:49.555625    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:37:49.555684    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:37:49.569142    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:37:49.569216    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:37:49.582265    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:37:49.582281    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:37:49.582288    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:37:49.606534    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:37:49.606549    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:37:49.633472    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:37:49.633486    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:37:49.650032    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:37:49.650046    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:37:49.689025    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:37:49.689040    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:37:49.709214    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:37:49.709229    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:37:49.724241    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:37:49.724255    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:37:49.739676    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:37:49.739690    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:37:49.757974    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:37:49.757988    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:37:49.773472    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:37:49.773486    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:37:49.799316    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:37:49.799330    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:37:49.830127    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:37:49.830142    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:37:49.845209    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:37:49.845223    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:37:49.931080    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:37:49.931094    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:37:49.971258    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:37:49.971273    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:37:49.986016    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:37:49.986030    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:37:50.005436    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:37:50.005449    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:37:50.020467    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:37:50.020479    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:37:50.073461    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:37:50.073475    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:37:50.101355    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:37:50.101476    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:37:50.117062    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:37:50.117071    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:37:50.126915    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:37:50.126927    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:37:50.126954    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:37:50.126961    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:37:50.126967    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:37:50.126972    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:37:50.126976    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:38:00.127950    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:38:05.129568    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:38:05.129747    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:38:05.145514    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:38:05.145603    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:38:05.159356    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:38:05.173873    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:38:05.188355    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:38:05.188438    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:38:05.202300    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:38:05.202371    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:38:05.215356    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:38:05.215431    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:38:05.229452    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:38:05.229533    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:38:05.242716    7817 logs.go:284] 0 containers: []
	W1218 15:38:05.242729    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:38:05.242793    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:38:05.256466    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:38:05.256541    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:38:05.274193    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:38:05.274211    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:38:05.274219    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:38:05.289500    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:38:05.289515    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:38:05.321078    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:05.321198    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:05.336456    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:38:05.336464    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:38:05.345135    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:38:05.345147    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:38:05.365591    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:38:05.365606    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:38:05.391765    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:38:05.391779    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:38:05.410725    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:38:05.410740    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:38:05.425990    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:38:05.426004    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:38:05.445527    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:38:05.445540    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:38:05.460899    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:38:05.460912    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:38:05.541547    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:38:05.541563    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:38:05.581764    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:38:05.581777    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:38:05.607846    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:38:05.607861    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:38:05.622814    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:38:05.622830    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:38:05.640522    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:38:05.640536    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:38:05.659008    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:38:05.659027    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:38:05.678681    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:38:05.678696    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:38:05.693026    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:38:05.693042    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:38:05.708671    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:38:05.708686    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:38:05.735577    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:38:05.735591    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:38:05.781980    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:05.781993    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:38:05.782022    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:38:05.782028    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:05.782035    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:05.782060    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:05.782070    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:38:15.783869    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:38:20.784680    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:38:20.784884    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:38:20.800325    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:38:20.800406    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:38:20.813583    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:38:20.813666    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:38:20.827024    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:38:20.827101    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:38:20.841066    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:38:20.841138    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:38:20.854820    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:38:20.854893    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:38:20.868539    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:38:20.868624    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:38:20.881174    7817 logs.go:284] 0 containers: []
	W1218 15:38:20.881186    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:38:20.881255    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:38:20.894582    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:38:20.894657    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:38:20.911089    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:38:20.911105    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:38:20.911115    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:38:20.996470    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:38:20.996484    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:38:21.018400    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:38:21.018414    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:38:21.062927    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:38:21.062943    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:38:21.080056    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:38:21.080069    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:38:21.094229    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:38:21.094244    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:38:21.112693    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:38:21.112708    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:38:21.128062    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:38:21.128076    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:38:21.144504    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:38:21.144518    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:38:21.160276    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:38:21.160289    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:38:21.188350    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:21.188469    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:21.204331    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:38:21.204343    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:38:21.213265    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:38:21.213276    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:38:21.240986    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:38:21.240999    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:38:21.279660    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:38:21.279673    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:38:21.311901    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:38:21.311915    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:38:21.337798    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:38:21.337813    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:38:21.352713    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:38:21.352727    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:38:21.373090    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:38:21.373105    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:38:21.397256    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:38:21.397270    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:38:21.418626    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:38:21.418640    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:38:21.434341    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:21.434355    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:38:21.434385    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:38:21.434392    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:21.434398    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:21.434404    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:21.434409    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:38:31.436625    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:38:36.437642    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:38:36.437823    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:38:36.452434    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:38:36.452507    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:38:36.469137    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:38:36.469215    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:38:36.482905    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:38:36.482983    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:38:36.496434    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:38:36.496514    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:38:36.510783    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:38:36.510862    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:38:36.524585    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:38:36.524665    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:38:36.537477    7817 logs.go:284] 0 containers: []
	W1218 15:38:36.537489    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:38:36.537553    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:38:36.550968    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:38:36.551046    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:38:36.563922    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:38:36.563938    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:38:36.563946    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:38:36.579843    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:38:36.579857    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:38:36.602285    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:38:36.602300    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:38:36.621205    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:38:36.621220    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:38:36.636252    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:38:36.636265    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:38:36.658876    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:38:36.658892    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:38:36.673876    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:38:36.673891    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:38:36.689828    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:38:36.689843    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:38:36.705297    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:38:36.705310    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:38:36.725019    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:38:36.725034    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:38:36.769738    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:38:36.769753    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:38:36.779785    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:38:36.779798    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:38:36.799867    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:38:36.799881    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:38:36.814695    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:38:36.814710    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:38:36.841526    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:38:36.841539    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:38:36.874299    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:38:36.874313    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:38:36.903401    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:36.903532    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:36.921578    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:38:36.921588    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:38:37.006115    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:38:37.006128    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:38:37.033973    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:38:37.033987    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:38:37.050489    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:38:37.050503    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:38:37.088959    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:37.088973    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:38:37.089003    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:38:37.089009    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:37.089014    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:37.089019    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:37.089024    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:38:47.090002    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:38:52.091784    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:38:52.092005    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:38:52.106504    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:38:52.106585    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:38:52.120882    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:38:52.120957    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:38:52.135448    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:38:52.135524    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:38:52.149182    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:38:52.149259    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:38:52.163054    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:38:52.163131    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:38:52.176522    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:38:52.176594    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:38:52.193988    7817 logs.go:284] 0 containers: []
	W1218 15:38:52.194002    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:38:52.194065    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:38:52.208478    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:38:52.208559    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:38:52.222156    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:38:52.222176    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:38:52.222186    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:38:52.236614    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:38:52.236629    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:38:52.251662    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:38:52.251676    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:38:52.304876    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:38:52.304892    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:38:52.334365    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:52.334493    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:52.351826    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:38:52.351853    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:38:52.361931    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:38:52.361946    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:38:52.383000    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:38:52.383014    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:38:52.403256    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:38:52.403271    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:38:52.421691    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:38:52.421705    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:38:52.437151    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:38:52.437165    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:38:52.457062    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:38:52.457077    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:38:52.472112    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:38:52.472126    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:38:52.487219    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:38:52.487233    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:38:52.503770    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:38:52.503784    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:38:52.528942    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:38:52.528955    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:38:52.566724    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:38:52.566737    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:38:52.581686    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:38:52.581701    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:38:52.671091    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:38:52.671107    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:38:52.698520    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:38:52.698534    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:38:52.726312    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:38:52.726327    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:38:52.742317    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:52.742330    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:38:52.742359    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:38:52.742365    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:38:52.742378    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:38:52.742386    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:38:52.742392    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:39:02.743172    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:39:07.744393    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:39:07.744538    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:39:07.758757    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:39:07.758832    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:39:07.772062    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:39:07.772141    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:39:07.785274    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:39:07.785350    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:39:07.799388    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:39:07.799461    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:39:07.815487    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:39:07.815571    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:39:07.829459    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:39:07.829540    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:39:07.843416    7817 logs.go:284] 0 containers: []
	W1218 15:39:07.843430    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:39:07.843497    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:39:07.856988    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:39:07.857063    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:39:07.870758    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:39:07.870774    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:39:07.870782    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:39:07.957542    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:39:07.957557    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:39:07.972634    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:39:07.972648    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:39:07.987522    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:39:07.987536    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:39:08.002626    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:39:08.002639    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:39:08.012761    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:39:08.012773    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:39:08.031840    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:39:08.031853    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:39:08.050703    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:39:08.050716    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:39:08.066709    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:39:08.066724    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:39:08.081123    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:39:08.081137    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:39:08.109736    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:39:08.109863    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:39:08.127501    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:39:08.127510    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:39:08.150550    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:39:08.150562    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:39:08.176830    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:39:08.176844    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:39:08.203674    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:39:08.203687    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:39:08.219015    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:39:08.219030    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:39:08.257422    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:39:08.257435    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:39:08.309959    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:39:08.309973    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:39:08.324558    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:39:08.324573    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:39:08.347888    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:39:08.347902    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:39:08.379731    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:39:08.379746    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:39:08.397722    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:39:08.397735    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:39:08.397765    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:39:08.397771    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:39:08.397776    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:39:08.397781    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:39:08.397787    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:39:18.400888    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:39:23.402279    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:39:23.402398    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-apiserver --format={{.ID}}
	I1218 15:39:23.415705    7817 logs.go:284] 2 containers: [0835519b2271 d7a131de94f2]
	I1218 15:39:23.415780    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_etcd --format={{.ID}}
	I1218 15:39:23.428843    7817 logs.go:284] 2 containers: [f705bd0f78ae c1c34392f2c4]
	I1218 15:39:23.428918    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_coredns --format={{.ID}}
	I1218 15:39:23.442138    7817 logs.go:284] 2 containers: [7786b16f143d e5bd7d5d2abc]
	I1218 15:39:23.442214    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-scheduler --format={{.ID}}
	I1218 15:39:23.454637    7817 logs.go:284] 2 containers: [e919d704eb71 8ed2d9614b64]
	I1218 15:39:23.454710    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-proxy --format={{.ID}}
	I1218 15:39:23.467677    7817 logs.go:284] 2 containers: [3464ef616d4c c5f46b946942]
	I1218 15:39:23.467748    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kube-controller-manager --format={{.ID}}
	I1218 15:39:23.480856    7817 logs.go:284] 2 containers: [dcc04a5cd29c c4ac6546cfdf]
	I1218 15:39:23.480928    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kindnet --format={{.ID}}
	I1218 15:39:23.493170    7817 logs.go:284] 0 containers: []
	W1218 15:39:23.493182    7817 logs.go:286] No container was found matching "kindnet"
	I1218 15:39:23.493251    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_kubernetes-dashboard --format={{.ID}}
	I1218 15:39:23.506780    7817 logs.go:284] 1 containers: [031281b5232b]
	I1218 15:39:23.506852    7817 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_storage-provisioner --format={{.ID}}
	I1218 15:39:23.519746    7817 logs.go:284] 2 containers: [3c3c3d79bae5 5f94eed8fe93]
	I1218 15:39:23.519765    7817 logs.go:123] Gathering logs for coredns [e5bd7d5d2abc] ...
	I1218 15:39:23.519773    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e5bd7d5d2abc"
	I1218 15:39:23.534543    7817 logs.go:123] Gathering logs for kube-scheduler [8ed2d9614b64] ...
	I1218 15:39:23.534556    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 8ed2d9614b64"
	I1218 15:39:23.555083    7817 logs.go:123] Gathering logs for kube-proxy [c5f46b946942] ...
	I1218 15:39:23.555097    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c5f46b946942"
	I1218 15:39:23.570634    7817 logs.go:123] Gathering logs for kube-apiserver [d7a131de94f2] ...
	I1218 15:39:23.570648    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 d7a131de94f2"
	I1218 15:39:23.596651    7817 logs.go:123] Gathering logs for etcd [f705bd0f78ae] ...
	I1218 15:39:23.596664    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 f705bd0f78ae"
	I1218 15:39:23.615497    7817 logs.go:123] Gathering logs for etcd [c1c34392f2c4] ...
	I1218 15:39:23.615511    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c1c34392f2c4"
	I1218 15:39:23.637716    7817 logs.go:123] Gathering logs for kube-scheduler [e919d704eb71] ...
	I1218 15:39:23.637729    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 e919d704eb71"
	I1218 15:39:23.653966    7817 logs.go:123] Gathering logs for kube-controller-manager [c4ac6546cfdf] ...
	I1218 15:39:23.653980    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 c4ac6546cfdf"
	I1218 15:39:23.685685    7817 logs.go:123] Gathering logs for storage-provisioner [5f94eed8fe93] ...
	I1218 15:39:23.685699    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 5f94eed8fe93"
	I1218 15:39:23.700872    7817 logs.go:123] Gathering logs for Docker ...
	I1218 15:39:23.700886    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u docker -u cri-docker -n 400"
	I1218 15:39:23.739938    7817 logs.go:123] Gathering logs for dmesg ...
	I1218 15:39:23.739952    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 15:39:23.749725    7817 logs.go:123] Gathering logs for kube-apiserver [0835519b2271] ...
	I1218 15:39:23.749739    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 0835519b2271"
	I1218 15:39:23.773774    7817 logs.go:123] Gathering logs for coredns [7786b16f143d] ...
	I1218 15:39:23.773789    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 7786b16f143d"
	I1218 15:39:23.804508    7817 logs.go:123] Gathering logs for kube-proxy [3464ef616d4c] ...
	I1218 15:39:23.804523    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3464ef616d4c"
	I1218 15:39:23.832582    7817 logs.go:123] Gathering logs for storage-provisioner [3c3c3d79bae5] ...
	I1218 15:39:23.832597    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 3c3c3d79bae5"
	I1218 15:39:23.863148    7817 logs.go:123] Gathering logs for container status ...
	I1218 15:39:23.863163    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 15:39:23.961025    7817 logs.go:123] Gathering logs for kubelet ...
	I1218 15:39:23.961038    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1218 15:39:23.990042    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:39:23.990169    7817 logs.go:138] Found kubelet problem: Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:39:24.009073    7817 logs.go:123] Gathering logs for describe nodes ...
	I1218 15:39:24.009081    7817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I1218 15:39:24.087970    7817 logs.go:123] Gathering logs for kube-controller-manager [dcc04a5cd29c] ...
	I1218 15:39:24.087986    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 dcc04a5cd29c"
	I1218 15:39:24.115831    7817 logs.go:123] Gathering logs for kubernetes-dashboard [031281b5232b] ...
	I1218 15:39:24.115845    7817 ssh_runner.go:195] Run: /bin/bash -c "docker logs --tail 400 031281b5232b"
	I1218 15:39:24.134275    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:39:24.134288    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	W1218 15:39:24.134318    7817 out.go:239] X Problems detected in kubelet:
	W1218 15:39:24.134324    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: W1218 23:33:36.405491    1332 reflector.go:535] object-"kubernetes-dashboard"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	W1218 15:39:24.134330    7817 out.go:239]   Dec 18 23:33:36 embed-certs-732000 kubelet[1332]: E1218 23:33:36.405604    1332 reflector.go:147] object-"kubernetes-dashboard"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:embed-certs-732000" cannot list resource "configmaps" in API group "" in the namespace "kubernetes-dashboard": no relationship found between node 'embed-certs-732000' and this object
	I1218 15:39:24.134337    7817 out.go:309] Setting ErrFile to fd 2...
	I1218 15:39:24.134341    7817 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:39:34.137016    7817 api_server.go:253] Checking apiserver healthz at https://192.169.0.43:8443/healthz ...
	I1218 15:39:39.137695    7817 api_server.go:269] stopped: https://192.169.0.43:8443/healthz: Get "https://192.169.0.43:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I1218 15:39:39.160062    7817 out.go:177] 
	W1218 15:39:39.180695    7817 out.go:239] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: wait for healthy API server: apiserver healthz never reported healthy: cluster wait timed out during healthz check
	W1218 15:39:39.180708    7817 out.go:239] * 
	W1218 15:39:39.181491    7817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 15:39:39.243857    7817 out.go:177] 

                                                
                                                
** /stderr **
start_stop_delete_test.go:259: failed to start minikube post-stop. args "out/minikube-darwin-amd64 start -p embed-certs-732000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4": exit status 80
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000
E1218 15:39:51.461764    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:39:55.599272    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:39:57.371413    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:40:11.370551    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:40:11.858161    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:40:27.005442    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:40:28.804956    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:40:51.447344    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000: exit status 3 (1m15.089271387s)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 15:40:54.435088    8085 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out
	E1218 15:40:54.435104    8085 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "embed-certs-732000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/embed-certs/serial/SecondStart (484.34s)

                                                
                                    
x
+
TestStartStop/group/default-k8s-diff-port/serial/FirstStart (802.04s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4
E1218 15:37:19.414901    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:37:44.212895    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:37:57.877710    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:38:07.593796    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.600240    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.611737    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.632075    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.672559    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.753538    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:07.914719    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:08.235304    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:08.877281    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:10.158007    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:12.720021    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:17.840264    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:22.755618    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:38:28.080835    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:38:36.111330    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:38:39.135368    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 15:38:48.562406    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:39:06.134857    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:39:29.524613    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4: exit status 52 (12m6.930192577s)

                                                
                                                
-- stdout --
	* [default-k8s-diff-port-748000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node default-k8s-diff-port-748000 in cluster default-k8s-diff-port-748000
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	* Deleting "default-k8s-diff-port-748000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1218 15:37:06.052114    8029 out.go:296] Setting OutFile to fd 1 ...
	I1218 15:37:06.052405    8029 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:37:06.052410    8029 out.go:309] Setting ErrFile to fd 2...
	I1218 15:37:06.052415    8029 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:37:06.052599    8029 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 15:37:06.054102    8029 out.go:303] Setting JSON to false
	I1218 15:37:06.076847    8029 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":3997,"bootTime":1702938629,"procs":457,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 15:37:06.076941    8029 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 15:37:06.098068    8029 out.go:177] * [default-k8s-diff-port-748000] minikube v1.32.0 on Darwin 14.2
	I1218 15:37:06.139993    8029 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 15:37:06.140009    8029 notify.go:220] Checking for updates...
	I1218 15:37:06.182879    8029 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:37:06.245860    8029 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 15:37:06.308857    8029 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 15:37:06.329849    8029 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:37:06.370879    8029 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 15:37:06.392813    8029 config.go:182] Loaded profile config "embed-certs-732000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 15:37:06.392981    8029 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 15:37:06.421843    8029 out.go:177] * Using the hyperkit driver based on user configuration
	I1218 15:37:06.442878    8029 start.go:298] selected driver: hyperkit
	I1218 15:37:06.442903    8029 start.go:902] validating driver "hyperkit" against <nil>
	I1218 15:37:06.442915    8029 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 15:37:06.446427    8029 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:37:06.446530    8029 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 15:37:06.454441    8029 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 15:37:06.458330    8029 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:37:06.458351    8029 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 15:37:06.458385    8029 start_flags.go:309] no existing cluster config was found, will generate one from the flags 
	I1218 15:37:06.458578    8029 start_flags.go:931] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 15:37:06.458615    8029 cni.go:84] Creating CNI manager for ""
	I1218 15:37:06.458628    8029 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:37:06.458637    8029 start_flags.go:318] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I1218 15:37:06.458643    8029 start_flags.go:323] config:
	{Name:default-k8s-diff-port-748000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-748000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster
.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:37:06.458782    8029 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:37:06.500732    8029 out.go:177] * Starting control plane node default-k8s-diff-port-748000 in cluster default-k8s-diff-port-748000
	I1218 15:37:06.522980    8029 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 15:37:06.523021    8029 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I1218 15:37:06.523037    8029 cache.go:56] Caching tarball of preloaded images
	I1218 15:37:06.523145    8029 preload.go:174] Found /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1218 15:37:06.523154    8029 cache.go:59] Finished verifying existence of preloaded tar for  v1.28.4 on docker
	I1218 15:37:06.523236    8029 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/default-k8s-diff-port-748000/config.json ...
	I1218 15:37:06.523255    8029 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/default-k8s-diff-port-748000/config.json: {Name:mk2eb79b144c398660b0c04a9a8cf11b07ac786d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 15:37:06.523570    8029 start.go:365] acquiring machines lock for default-k8s-diff-port-748000: {Name:mk129da0b7e14236047c6f70b7fc622a9cc1d994 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1218 15:37:06.523621    8029 start.go:369] acquired machines lock for "default-k8s-diff-port-748000" in 37.231µs
	I1218 15:37:06.523646    8029 start.go:93] Provisioning new machine with config: &{Name:default-k8s-diff-port-748000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-748000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:} &{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1218 15:37:06.523687    8029 start.go:125] createHost starting for "" (driver="hyperkit")
	I1218 15:37:06.545072    8029 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I1218 15:37:06.545455    8029 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:37:06.545510    8029 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:37:06.554718    8029 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56697
	I1218 15:37:06.555080    8029 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:37:06.555478    8029 main.go:141] libmachine: Using API Version  1
	I1218 15:37:06.555489    8029 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:37:06.555737    8029 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:37:06.555833    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetMachineName
	I1218 15:37:06.555913    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:37:06.556017    8029 start.go:159] libmachine.API.Create for "default-k8s-diff-port-748000" (driver="hyperkit")
	I1218 15:37:06.556046    8029 client.go:168] LocalClient.Create starting
	I1218 15:37:06.556079    8029 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem
	I1218 15:37:06.556130    8029 main.go:141] libmachine: Decoding PEM data...
	I1218 15:37:06.556148    8029 main.go:141] libmachine: Parsing certificate...
	I1218 15:37:06.556215    8029 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem
	I1218 15:37:06.556251    8029 main.go:141] libmachine: Decoding PEM data...
	I1218 15:37:06.556262    8029 main.go:141] libmachine: Parsing certificate...
	I1218 15:37:06.556277    8029 main.go:141] libmachine: Running pre-create checks...
	I1218 15:37:06.556286    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .PreCreateCheck
	I1218 15:37:06.556388    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:06.556548    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetConfigRaw
	I1218 15:37:06.566442    8029 main.go:141] libmachine: Creating machine...
	I1218 15:37:06.566469    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .Create
	I1218 15:37:06.566699    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:06.567035    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:37:06.566673    8037 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:37:06.567165    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Downloading /Users/jenkins/minikube-integration/17822-999/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17822-999/.minikube/cache/iso/amd64/minikube-v1.32.1-1702708929-17806-amd64.iso...
	I1218 15:37:06.732214    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:37:06.732147    8037 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/id_rsa...
	I1218 15:37:07.049752    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:37:07.049656    8037 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk...
	I1218 15:37:07.049773    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Writing magic tar header
	I1218 15:37:07.049786    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Writing SSH key tar header
	I1218 15:37:07.050518    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:37:07.050429    8037 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000 ...
	I1218 15:37:07.382197    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:07.382224    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid
	I1218 15:37:07.382237    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Using UUID 5ac929bc-9dfe-11ee-9910-f01898ef957c
	I1218 15:37:07.407620    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Generated MAC f2:66:e9:8a:57:73
	I1218 15:37:07.407638    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000
	I1218 15:37:07.407758    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ac929bc-9dfe-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011b1d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:37:07.407806    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ac929bc-9dfe-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011b1d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:37:07.407839    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ac929bc-9dfe-11ee-9910-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"}
	I1218 15:37:07.407879    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ac929bc-9dfe-11ee-9910-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring -f kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"
	I1218 15:37:07.407890    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1218 15:37:07.410552    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 DEBUG: hyperkit: Pid is 8038
	I1218 15:37:07.410951    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 0
	I1218 15:37:07.410963    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:07.411054    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:07.412057    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:07.412163    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:37:07.412179    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:37:07.412192    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:37:07.412199    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:37:07.412212    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:37:07.412219    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:37:07.412227    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:37:07.412235    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:37:07.412245    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:37:07.412287    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:37:07.412305    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:37:07.412327    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:37:07.412344    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:37:07.412383    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:37:07.412407    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:37:07.412426    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:37:07.412449    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:37:07.412467    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:37:07.412485    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:37:07.412503    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:37:07.412519    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:37:07.412552    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:37:07.412569    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:37:07.412599    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:37:07.412618    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:37:07.412635    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:37:07.412670    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:37:07.412692    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:37:07.412708    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:37:07.412723    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:37:07.412736    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:37:07.412750    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:37:07.412770    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:37:07.412784    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:37:07.412793    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:37:07.412801    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:37:07.412810    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:37:07.412816    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:37:07.412824    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:37:07.412831    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:37:07.412843    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:37:07.412859    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:37:07.412870    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:37:07.417909    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1218 15:37:07.426414    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1218 15:37:07.427137    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:37:07.427180    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:37:07.427196    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:37:07.427209    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:37:07.796114    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1218 15:37:07.796131    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1218 15:37:07.900195    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:37:07.900215    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:37:07.900225    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:37:07.900232    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:37:07.901071    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1218 15:37:07.901082    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1218 15:37:09.413293    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 1
	I1218 15:37:09.413310    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:09.413417    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:09.414289    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:09.414360    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:37:09.414370    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:37:09.414404    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:37:09.414420    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:37:09.414432    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:37:09.414450    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:37:09.414458    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:37:09.414466    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:37:09.414474    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:37:09.414481    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:37:09.414489    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:37:09.414496    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:37:09.414507    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:37:09.414517    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:37:09.414524    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:37:09.414533    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:37:09.414549    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:37:09.414562    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:37:09.414576    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:37:09.414588    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:37:09.414596    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:37:09.414610    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:37:09.414623    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:37:09.414636    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:37:09.414644    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:37:09.414653    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:37:09.414661    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:37:09.414672    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:37:09.414690    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:37:09.414703    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:37:09.414712    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:37:09.414721    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:37:09.414730    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:37:09.414743    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:37:09.414755    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:37:09.414765    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:37:09.414782    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:37:09.414795    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:37:09.414804    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:37:09.414813    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:37:09.414821    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:37:09.414834    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:37:09.414845    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:37:11.414998    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 2
	I1218 15:37:11.415015    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:11.415087    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:11.415912    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:11.415989    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:37:11.416005    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:37:11.416015    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:37:11.416023    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:37:11.416030    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:37:11.416038    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:37:11.416048    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:37:11.416057    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:37:11.416065    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:37:11.416072    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:37:11.416080    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:37:11.416087    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:37:11.416094    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:37:11.416103    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:37:11.416110    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:37:11.416117    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:37:11.416126    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:37:11.416136    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:37:11.416143    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:37:11.416158    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:37:11.416167    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:37:11.416175    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:37:11.416184    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:37:11.416192    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:37:11.416201    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:37:11.416216    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:37:11.416225    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:37:11.416233    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:37:11.416242    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:37:11.416250    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:37:11.416258    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:37:11.416266    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:37:11.416273    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:37:11.416281    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:37:11.416290    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:37:11.416307    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:37:11.416321    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:37:11.416336    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:37:11.416346    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:37:11.416354    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:37:11.416361    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:37:11.416369    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:37:11.416378    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:37:12.903901    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:12 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1218 15:37:12.903916    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:12 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1218 15:37:12.903933    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:37:12 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1218 15:37:13.416259    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 3
	I1218 15:37:13.416277    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:13.416396    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:13.417279    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:13.417369    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:37:13.417380    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:37:13.417407    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:37:13.417417    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:37:13.417426    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:37:13.417433    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:37:13.417448    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:37:13.417462    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:37:13.417475    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:37:13.417483    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:37:13.417492    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:37:13.417499    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:37:13.417519    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:37:13.417529    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:37:13.417536    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:37:13.417544    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:37:13.417559    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:37:13.417572    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:37:13.417581    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:37:13.417590    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:37:13.417599    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:37:13.417609    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:37:13.417617    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:37:13.417625    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:37:13.417637    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:37:13.417651    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:37:13.417674    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:37:13.417690    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:37:13.417703    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:37:13.417713    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:37:13.417720    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:37:13.417727    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:37:13.417736    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:37:13.417749    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:37:13.417757    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:37:13.417767    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:37:13.417774    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:37:13.417783    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:37:13.417791    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:37:13.417800    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:37:13.417808    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:37:13.417816    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:37:13.417825    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:37:15.418664    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 4
	I1218 15:37:15.418689    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:15.418788    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:15.419721    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:15.419814    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 42 entries in /var/db/dhcpd_leases!
	I1218 15:37:15.419831    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:37:15.419861    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:37:15.419873    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:37:15.419890    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:37:15.419911    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:37:15.419921    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:37:15.419930    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:37:15.419942    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:37:15.419952    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:37:15.419967    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:37:15.419988    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:37:15.420004    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:37:15.420015    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:37:15.420035    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:37:15.420059    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:37:15.420078    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:37:15.420092    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:37:15.420109    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:37:15.420133    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:37:15.420147    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:37:15.420157    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:37:15.420168    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:37:15.420178    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:37:15.420187    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:37:15.420213    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:37:15.420228    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:37:15.420239    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:37:15.420250    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:37:15.420260    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:37:15.420269    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:37:15.420278    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:37:15.420289    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:37:15.420298    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:37:15.420308    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:37:15.420320    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:37:15.420331    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:37:15.420340    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:37:15.420350    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:37:15.420380    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:37:15.420394    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:37:15.420404    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:37:15.420414    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:37:17.420478    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 5
	I1218 15:37:17.420507    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:17.420618    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:17.422135    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for f2:66:e9:8a:57:73 in /var/db/dhcpd_leases ...
	I1218 15:37:17.422290    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:37:17.422315    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6582292b}
	I1218 15:37:17.422341    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found match: f2:66:e9:8a:57:73
	I1218 15:37:17.422364    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | IP: 192.169.0.44
	I1218 15:37:17.422422    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetConfigRaw
	I1218 15:37:17.423184    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:37:17.423319    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:37:17.423435    8029 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I1218 15:37:17.423453    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetState
	I1218 15:37:17.423560    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:37:17.423638    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:37:17.424709    8029 main.go:141] libmachine: Detecting operating system of created instance...
	I1218 15:37:17.424721    8029 main.go:141] libmachine: Waiting for SSH to be available...
	I1218 15:37:17.424727    8029 main.go:141] libmachine: Getting to WaitForSSH function...
	I1218 15:37:17.424733    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHHostname
	I1218 15:37:17.424820    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHPort
	I1218 15:37:17.424906    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:37:17.424993    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:37:17.425093    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHUsername
	I1218 15:37:17.425202    8029 main.go:141] libmachine: Using SSH client type: native
	I1218 15:37:17.425506    8029 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.44 22 <nil> <nil>}
	I1218 15:37:17.425514    8029 main.go:141] libmachine: About to run SSH command:
	exit 0
	I1218 15:38:32.427488    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:39:50.430402    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:41:08.432961    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:42:26.435536    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:43:06.561943    8029 start.go:128] duration metric: createHost completed in 6m0.032530046s
	I1218 15:43:06.561970    8029 start.go:83] releasing machines lock for "default-k8s-diff-port-748000", held for 6m0.032625565s
	W1218 15:43:06.561999    8029 start.go:694] error starting host: creating host: create host timed out in 360.000000 seconds
	I1218 15:43:06.562562    8029 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:43:06.562598    8029 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:43:06.571493    8029 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56725
	I1218 15:43:06.571845    8029 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:43:06.572238    8029 main.go:141] libmachine: Using API Version  1
	I1218 15:43:06.572259    8029 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:43:06.572469    8029 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:43:06.572812    8029 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:43:06.572833    8029 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:43:06.580721    8029 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56727
	I1218 15:43:06.581048    8029 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:43:06.581385    8029 main.go:141] libmachine: Using API Version  1
	I1218 15:43:06.581396    8029 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:43:06.581616    8029 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:43:06.581713    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetState
	I1218 15:43:06.581809    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:06.581883    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:43:06.582814    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:43:06.604207    8029 out.go:177] * Deleting "default-k8s-diff-port-748000" in hyperkit ...
	I1218 15:43:06.646704    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .Remove
	I1218 15:43:06.647016    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:06.647060    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:06.647090    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:43:06.648298    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:06.648331    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | waiting for graceful shutdown
	I1218 15:43:06.795458    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:06 INFO : hyperkit: stdout: linkname /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty
	I1218 15:43:06.795480    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:06 INFO : hyperkit: stdout: COM1 connected to /dev/ttys000
	I1218 15:43:06.805557    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:06 WARN : hyperkit: failed to read stdout: EOF
	I1218 15:43:06.805578    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:06 WARN : hyperkit: failed to read stderr: EOF
	I1218 15:43:07.648694    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:07.648856    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8038
	I1218 15:43:07.650441    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid 8038 missing from process table
	W1218 15:43:07.770000    8029 out.go:239] ! StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds
	! StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds
	I1218 15:43:07.770018    8029 start.go:709] Will try again in 5 seconds ...
	I1218 15:43:12.770942    8029 start.go:365] acquiring machines lock for default-k8s-diff-port-748000: {Name:mk129da0b7e14236047c6f70b7fc622a9cc1d994 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1218 15:43:12.771147    8029 start.go:369] acquired machines lock for "default-k8s-diff-port-748000" in 136.064µs
	I1218 15:43:12.771177    8029 start.go:93] Provisioning new machine with config: &{Name:default-k8s-diff-port-748000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:
22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-748000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker Mo
untIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:} &{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1218 15:43:12.771275    8029 start.go:125] createHost starting for "" (driver="hyperkit")
	I1218 15:43:12.793131    8029 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I1218 15:43:12.793288    8029 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:43:12.793337    8029 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:43:12.803589    8029 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56729
	I1218 15:43:12.804000    8029 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:43:12.804420    8029 main.go:141] libmachine: Using API Version  1
	I1218 15:43:12.804435    8029 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:43:12.804661    8029 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:43:12.804769    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetMachineName
	I1218 15:43:12.804846    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:43:12.804938    8029 start.go:159] libmachine.API.Create for "default-k8s-diff-port-748000" (driver="hyperkit")
	I1218 15:43:12.804954    8029 client.go:168] LocalClient.Create starting
	I1218 15:43:12.804987    8029 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/ca.pem
	I1218 15:43:12.805034    8029 main.go:141] libmachine: Decoding PEM data...
	I1218 15:43:12.805045    8029 main.go:141] libmachine: Parsing certificate...
	I1218 15:43:12.805086    8029 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17822-999/.minikube/certs/cert.pem
	I1218 15:43:12.805123    8029 main.go:141] libmachine: Decoding PEM data...
	I1218 15:43:12.805136    8029 main.go:141] libmachine: Parsing certificate...
	I1218 15:43:12.805151    8029 main.go:141] libmachine: Running pre-create checks...
	I1218 15:43:12.805157    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .PreCreateCheck
	I1218 15:43:12.805230    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:12.805261    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetConfigRaw
	I1218 15:43:12.815341    8029 main.go:141] libmachine: Creating machine...
	I1218 15:43:12.815360    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .Create
	I1218 15:43:12.815537    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:12.815835    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:43:12.815506    8176 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:43:12.815916    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Downloading /Users/jenkins/minikube-integration/17822-999/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17822-999/.minikube/cache/iso/amd64/minikube-v1.32.1-1702708929-17806-amd64.iso...
	I1218 15:43:12.980751    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:43:12.980681    8176 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/id_rsa...
	I1218 15:43:13.037797    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:43:13.037702    8176 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk...
	I1218 15:43:13.037813    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Writing magic tar header
	I1218 15:43:13.037829    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Writing SSH key tar header
	I1218 15:43:13.038530    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | I1218 15:43:13.038470    8176 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000 ...
	I1218 15:43:13.370780    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:13.370803    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid
	I1218 15:43:13.370820    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Using UUID 35162ef8-9dff-11ee-9910-f01898ef957c
	I1218 15:43:13.397205    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Generated MAC 96:14:14:3c:3b:a0
	I1218 15:43:13.397226    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000
	I1218 15:43:13.397275    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"35162ef8-9dff-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00009f1d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0,
Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:43:13.397317    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"35162ef8-9dff-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00009f1d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0,
Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:43:13.397420    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "35162ef8-9dff-11ee-9910-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17822-999/.minik
ube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"}
	I1218 15:43:13.397505    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 35162ef8-9dff-11ee-9910-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring -f kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenki
ns/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"
	I1218 15:43:13.397535    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1218 15:43:13.400616    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 DEBUG: hyperkit: Pid is 8177
	I1218 15:43:13.401060    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 0
	I1218 15:43:13.401078    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:13.401207    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:13.402268    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:13.402399    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:43:13.402421    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}
	I1218 15:43:13.402464    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:43:13.402486    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:43:13.402517    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:43:13.402539    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:43:13.402550    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:43:13.402566    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:43:13.402582    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:43:13.402597    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:43:13.402621    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:43:13.402639    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:43:13.402664    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:43:13.402675    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:43:13.402688    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:43:13.402702    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:43:13.402714    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:43:13.402727    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:43:13.402737    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:43:13.402747    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:43:13.402758    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:43:13.402769    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:43:13.402779    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:43:13.402794    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:43:13.402809    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:43:13.402834    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:43:13.402862    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:43:13.402881    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:43:13.402899    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:43:13.402915    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:43:13.402929    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:43:13.402943    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:43:13.402957    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:43:13.402972    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:43:13.402985    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:43:13.402997    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:43:13.403008    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:43:13.403018    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:43:13.403042    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:43:13.403059    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:43:13.403079    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:43:13.403093    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:43:13.403102    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:43:13.403115    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:43:13.408067    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1218 15:43:13.416918    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1218 15:43:13.417769    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:43:13.417791    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:43:13.417802    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:43:13.417813    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:43:13.785938    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1218 15:43:13.785955    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1218 15:43:13.889993    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:43:13.890014    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:43:13.890031    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:43:13.890041    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:43:13.890937    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1218 15:43:13.890951    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:13 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1218 15:43:15.403029    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 1
	I1218 15:43:15.403044    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:15.403175    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:15.403979    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:15.404054    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:43:15.404065    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}
	I1218 15:43:15.404083    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:43:15.404095    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:43:15.404120    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:43:15.404131    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:43:15.404162    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:43:15.404171    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:43:15.404179    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:43:15.404192    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:43:15.404200    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:43:15.404209    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:43:15.404220    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:43:15.404227    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:43:15.404238    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:43:15.404249    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:43:15.404258    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:43:15.404266    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:43:15.404275    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:43:15.404282    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:43:15.404289    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:43:15.404297    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:43:15.404304    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:43:15.404313    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:43:15.404326    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:43:15.404336    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:43:15.404346    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:43:15.404355    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:43:15.404363    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:43:15.404373    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:43:15.404382    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:43:15.404390    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:43:15.404398    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:43:15.404407    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:43:15.404417    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:43:15.404425    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:43:15.404435    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:43:15.404443    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:43:15.404452    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:43:15.404459    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:43:15.404473    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:43:15.404483    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:43:15.404496    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:43:15.404508    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:43:17.404391    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 2
	I1218 15:43:17.404407    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:17.404474    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:17.405264    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:17.405345    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:43:17.405369    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}
	I1218 15:43:17.405379    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:43:17.405412    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:43:17.405423    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:43:17.405442    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:43:17.405453    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:43:17.405468    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:43:17.405479    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:43:17.405492    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:43:17.405504    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:43:17.405513    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:43:17.405522    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:43:17.405539    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:43:17.405551    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:43:17.405566    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:43:17.405575    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:43:17.405584    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:43:17.405593    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:43:17.405601    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:43:17.405610    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:43:17.405618    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:43:17.405627    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:43:17.405635    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:43:17.405644    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:43:17.405659    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:43:17.405668    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:43:17.405678    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:43:17.405688    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:43:17.405698    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:43:17.405707    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:43:17.405716    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:43:17.405725    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:43:17.405734    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:43:17.405742    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:43:17.405750    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:43:17.405758    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:43:17.405768    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:43:17.405777    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:43:17.405784    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:43:17.405793    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:43:17.405801    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:43:17.405809    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:43:17.405819    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:43:18.820868    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:18 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1218 15:43:18.820947    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:18 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1218 15:43:18.820958    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:43:18 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1218 15:43:19.405675    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 3
	I1218 15:43:19.405695    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:19.405758    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:19.406583    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:19.406653    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:43:19.406662    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}
	I1218 15:43:19.406671    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:43:19.406679    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:43:19.406687    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:43:19.406695    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:43:19.406703    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:43:19.406718    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:43:19.406727    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:43:19.406733    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:43:19.406742    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:43:19.406749    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:43:19.406781    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:43:19.406798    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:43:19.406809    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:43:19.406821    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:43:19.406828    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:43:19.406837    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:43:19.406845    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:43:19.406854    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:43:19.406862    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:43:19.406868    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:43:19.406885    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:43:19.406892    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:43:19.406900    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:43:19.406909    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:43:19.406916    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:43:19.406922    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:43:19.406930    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:43:19.406937    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:43:19.406948    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:43:19.406955    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:43:19.406963    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:43:19.406973    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:43:19.406981    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:43:19.406990    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:43:19.406996    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:43:19.407003    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:43:19.407013    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:43:19.407035    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:43:19.407051    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:43:19.407061    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:43:19.407070    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:43:19.407082    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
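The repeated "Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases" attempts above show the hyperkit driver polling the macOS DHCP lease file until the new VM's MAC address appears (at which point the machine gets its IP). As a rough illustration only — not minikube's actual implementation — a lease entry in the format printed by these logs can be matched like this; the regex field names below are taken from the log lines themselves:

```python
import re

# Lease-entry pattern assumed from the log output above; real entries live in
# /var/db/dhcpd_leases on the macOS host.
ENTRY_RE = re.compile(
    r"\{Name:(?P<name>[^ ]*) IPAddress:(?P<ip>[^ ]+) "
    r"HWAddress:(?P<hw>[^ ]+) ID:(?P<id>[^ ]+) Lease:(?P<lease>[^}]+)\}"
)

def find_ip_for_mac(entries, mac):
    """Return the IP address leased to `mac`, or None if no lease exists yet.

    This mirrors what each logged "Attempt N" is checking for: the driver
    retries until the VM's MAC shows up in the lease file.
    """
    for line in entries:
        m = ENTRY_RE.search(line)
        if m and m.group("hw") == mac:
            return m.group("ip")
    return None

# Two entries copied verbatim from the log above.
entries = [
    "{Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}",
    "{Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}",
]

print(find_ip_for_mac(entries, "f2:66:e9:8a:57:73"))  # 192.169.0.44
print(find_ip_for_mac(entries, "96:14:14:3c:3b:a0"))  # None — the new VM has no lease yet
```

Each failed lookup corresponds to one "Attempt N" block in the log; the driver sleeps and rescans the full lease file (43 entries here) on every pass.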
	I1218 15:43:21.408252    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 4
	I1218 15:43:21.408270    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:21.408311    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:21.409167    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:21.409229    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I1218 15:43:21.409239    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.44 HWAddress:f2:66:e9:8a:57:73 ID:1,f2:66:e9:8a:57:73 Lease:0x6580d90a}
	I1218 15:43:21.409256    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.43 HWAddress:ae:ea:f0:6a:c1:e9 ID:1,ae:ea:f0:6a:c1:e9 Lease:0x6582282a}
	I1218 15:43:21.409269    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.41 HWAddress:5a:bb:86:81:7e:3b ID:1,5a:bb:86:81:7e:3b Lease:0x6580d79d}
	I1218 15:43:21.409280    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.42 HWAddress:e2:61:5d:1b:8b:e ID:1,e2:61:5d:1b:8b:e Lease:0x65822769}
	I1218 15:43:21.409289    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.40 HWAddress:be:ae:7:4a:b1:e7 ID:1,be:ae:7:4a:b1:e7 Lease:0x6582266c}
	I1218 15:43:21.409297    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.39 HWAddress:e:b8:aa:f6:3d:6 ID:1,e:b8:aa:f6:3d:6 Lease:0x6582261f}
	I1218 15:43:21.409310    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.38 HWAddress:ea:97:1e:7c:d6:8 ID:1,ea:97:1e:7c:d6:8 Lease:0x65822611}
	I1218 15:43:21.409334    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.37 HWAddress:e6:c4:e5:78:db:a5 ID:1,e6:c4:e5:78:db:a5 Lease:0x658225cd}
	I1218 15:43:21.409356    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.36 HWAddress:a:27:22:b0:d3:1 ID:1,a:27:22:b0:d3:1 Lease:0x658225c1}
	I1218 15:43:21.409387    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.35 HWAddress:8e:cf:da:63:d8:fe ID:1,8e:cf:da:63:d8:fe Lease:0x65822569}
	I1218 15:43:21.409403    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.34 HWAddress:6a:31:a1:4d:bb:f5 ID:1,6a:31:a1:4d:bb:f5 Lease:0x65822553}
	I1218 15:43:21.409412    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.33 HWAddress:6:c6:98:eb:ea:2c ID:1,6:c6:98:eb:ea:2c Lease:0x6582250a}
	I1218 15:43:21.409422    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.32 HWAddress:3e:79:5c:d9:7a:33 ID:1,3e:79:5c:d9:7a:33 Lease:0x658224d9}
	I1218 15:43:21.409438    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.31 HWAddress:52:2e:d4:42:e7:3c ID:1,52:2e:d4:42:e7:3c Lease:0x6580d37e}
	I1218 15:43:21.409451    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.30 HWAddress:ba:66:f5:b8:78:5d ID:1,ba:66:f5:b8:78:5d Lease:0x6580d339}
	I1218 15:43:21.409459    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.29 HWAddress:da:74:f2:6a:f8:f1 ID:1,da:74:f2:6a:f8:f1 Lease:0x65822482}
	I1218 15:43:21.409472    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.28 HWAddress:46:30:e8:d5:b4:79 ID:1,46:30:e8:d5:b4:79 Lease:0x6582244f}
	I1218 15:43:21.409483    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.27 HWAddress:92:b7:f6:53:5e:35 ID:1,92:b7:f6:53:5e:35 Lease:0x6580d2f7}
	I1218 15:43:21.409500    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.26 HWAddress:76:7f:aa:f1:b6:93 ID:1,76:7f:aa:f1:b6:93 Lease:0x6582230f}
	I1218 15:43:21.409508    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:e:d0:af:3b:93:af ID:1,e:d0:af:3b:93:af Lease:0x658222da}
	I1218 15:43:21.409518    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:7e:bb:25:ab:d4:54 ID:1,7e:bb:25:ab:d4:54 Lease:0x658222cd}
	I1218 15:43:21.409526    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:96:7b:cc:a9:f5:5a ID:1,96:7b:cc:a9:f5:5a Lease:0x6580d150}
	I1218 15:43:21.409534    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:6e:2c:bf:e5:d4:c8 ID:1,6e:2c:bf:e5:d4:c8 Lease:0x6580d142}
	I1218 15:43:21.409564    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:36:d8:32:a:3c:31 ID:1,36:d8:32:a:3c:31 Lease:0x65822280}
	I1218 15:43:21.409581    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:62:20:33:b0:56:e5 ID:1,62:20:33:b0:56:e5 Lease:0x65822268}
	I1218 15:43:21.409593    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:6e:47:1e:5f:3e:2 ID:1,6e:47:1e:5f:3e:2 Lease:0x658221f9}
	I1218 15:43:21.409605    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ae:34:98:ff:97:8d ID:1,ae:34:98:ff:97:8d Lease:0x6582218f}
	I1218 15:43:21.409622    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:ce:68:21:fe:24:bd ID:1,ce:68:21:fe:24:bd Lease:0x65822141}
	I1218 15:43:21.409648    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:6e:45:2:96:10:89 ID:1,6e:45:2:96:10:89 Lease:0x658220ac}
	I1218 15:43:21.409665    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:90:9b:fb:6a:2d ID:1,66:90:9b:fb:6a:2d Lease:0x6580ceb2}
	I1218 15:43:21.409682    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:e6:18:c3:b6:10:29 ID:1,e6:18:c3:b6:10:29 Lease:0x65822080}
	I1218 15:43:21.409699    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:7e:6c:62:57:be:53 ID:1,7e:6c:62:57:be:53 Lease:0x6582204c}
	I1218 15:43:21.409713    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:a:e:6c:4:f9:97 ID:1,a:e:6c:4:f9:97 Lease:0x6580cd3c}
	I1218 15:43:21.409729    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:c2:cb:5b:6d:94:c5 ID:1,c2:cb:5b:6d:94:c5 Lease:0x6580cd26}
	I1218 15:43:21.409745    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:fa:44:a:25:8e:6b ID:1,fa:44:a:25:8e:6b Lease:0x65821e5d}
	I1218 15:43:21.409754    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:1a:fd:bf:11:b7:85 ID:1,1a:fd:bf:11:b7:85 Lease:0x65821e36}
	I1218 15:43:21.409761    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:26:1d:41:79:69:22 ID:1,26:1d:41:79:69:22 Lease:0x65821dfa}
	I1218 15:43:21.409769    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:32:57:7:64:35:d8 ID:1,32:57:7:64:35:d8 Lease:0x65821d79}
	I1218 15:43:21.409779    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:16:1a:98:4e:b8:45 ID:1,16:1a:98:4e:b8:45 Lease:0x65821d5d}
	I1218 15:43:21.409788    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:4e:ee:a9:b6:7e:85 ID:1,4e:ee:a9:b6:7e:85 Lease:0x65821c78}
	I1218 15:43:21.409796    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:3a:b1:4e:f8:70:be ID:1,3a:b1:4e:f8:70:be Lease:0x6580caec}
	I1218 15:43:21.409807    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:2e:85:f3:9a:3:e4 ID:1,2e:85:f3:9a:3:e4 Lease:0x65821b3b}
	I1218 15:43:21.409817    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name: IPAddress:192.169.0.2 HWAddress:82:2b:f6:5b:7f:bf ID:1,82:2b:f6:5b:7f:bf Lease:0x658219d1}
	I1218 15:43:23.411781    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 5
	I1218 15:43:23.411807    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:23.411963    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:23.413456    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:43:23.413562    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I1218 15:43:23.413585    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:96:14:14:3c:3b:a0 ID:1,96:14:14:3c:3b:a0 Lease:0x65822a99}
	I1218 15:43:23.413617    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found match: 96:14:14:3c:3b:a0
	I1218 15:43:23.413633    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | IP: 192.169.0.45
	I1218 15:43:23.413706    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetConfigRaw
	I1218 15:43:23.414438    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:43:23.414592    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:43:23.414736    8029 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I1218 15:43:23.414750    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetState
	I1218 15:43:23.414881    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:43:23.414962    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:43:23.415976    8029 main.go:141] libmachine: Detecting operating system of created instance...
	I1218 15:43:23.415989    8029 main.go:141] libmachine: Waiting for SSH to be available...
	I1218 15:43:23.415996    8029 main.go:141] libmachine: Getting to WaitForSSH function...
	I1218 15:43:23.416003    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHHostname
	I1218 15:43:23.416130    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHPort
	I1218 15:43:23.416257    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:43:23.416371    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:43:23.416484    8029 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHUsername
	I1218 15:43:23.416637    8029 main.go:141] libmachine: Using SSH client type: native
	I1218 15:43:23.416954    8029 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.45 22 <nil> <nil>}
	I1218 15:43:23.416962    8029 main.go:141] libmachine: About to run SSH command:
	exit 0
	I1218 15:43:44.439345    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:44:38.418716    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 15:45:02.441183    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:45:56.421116    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 15:46:20.442765    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:47:14.424850    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 15:47:38.446115    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:48:32.455610    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 15:48:56.483541    8029 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.44:22: connect: operation timed out
	I1218 15:49:12.847964    8029 start.go:128] duration metric: createHost completed in 6m0.034690615s
	I1218 15:49:12.848002    8029 start.go:83] releasing machines lock for "default-k8s-diff-port-748000", held for 6m0.034874075s
	W1218 15:49:12.848100    8029 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p default-k8s-diff-port-748000" may fix it: creating host: create host timed out in 360.000000 seconds
	* Failed to start hyperkit VM. Running "minikube delete -p default-k8s-diff-port-748000" may fix it: creating host: create host timed out in 360.000000 seconds
	I1218 15:49:12.869868    8029 out.go:177] 
	W1218 15:49:12.891325    8029 out.go:239] X Exiting due to DRV_CREATE_TIMEOUT: Failed to start host: creating host: create host timed out in 360.000000 seconds
	X Exiting due to DRV_CREATE_TIMEOUT: Failed to start host: creating host: create host timed out in 360.000000 seconds
	W1218 15:49:12.891374    8029 out.go:239] * Suggestion: Try 'minikube delete', and disable any conflicting VPN or firewall software
	* Suggestion: Try 'minikube delete', and disable any conflicting VPN or firewall software
	W1218 15:49:12.891397    8029 out.go:239] * Related issue: https://github.com/kubernetes/minikube/issues/7072
	* Related issue: https://github.com/kubernetes/minikube/issues/7072
	I1218 15:49:12.913412    8029 out.go:177] 

** /stderr **
start_stop_delete_test.go:188: failed starting minikube -first start-. args "out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4": exit status 52
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 3 (1m15.090360523s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 15:50:28.088610    8291 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	E1218 15:50:28.088623    8291 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-748000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (802.04s)

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (690.19s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
E1218 15:41:22.286897    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:41:34.426213    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:41:49.977621    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 15:41:50.056119    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:41:51.733182    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:42:02.326925    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:42:02.848566    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:42:57.883822    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:43:07.598280    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:43:22.194251    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 15:43:22.760287    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:43:25.432691    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:43:25.898835    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:43:35.290208    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:43:36.114718    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:43:39.140809    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:44:45.811845    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:44:51.464810    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:44:55.603858    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:44:57.377943    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:44:59.201877    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:45:11.374354    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:45:27.009500    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:45:28.810125    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:46:00.939179    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:46:14.514618    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:46:18.658038    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:46:22.292390    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:46:51.739094    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:47:02.331557    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:47:02.853307    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:47:57.886525    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:48:07.603115    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:48:14.792039    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:48:22.783261    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:48:36.149104    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:48:39.177578    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:49:51.506354    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:274: ***** TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: pod "k8s-app=kubernetes-dashboard" failed to start within 9m0s: context deadline exceeded ****
start_stop_delete_test.go:274: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-732000 -n embed-certs-732000
E1218 15:49:55.645404    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:49:57.418456    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:50:11.415530    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:50:27.050354    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-732000 -n embed-certs-732000: exit status 3 (1m15.09410037s)

-- stdout --
	Nonexistent

-- /stdout --
** stderr ** 
	E1218 15:51:09.576603    8307 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out
	E1218 15:51:09.576632    8307 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out

** /stderr **
start_stop_delete_test.go:274: status error: exit status 3 (may be ok)
start_stop_delete_test.go:274: "embed-certs-732000" apiserver is not running, skipping kubectl commands (state="Nonexistent")
start_stop_delete_test.go:275: failed waiting for 'addon dashboard' pod post-stop-start: k8s-app=kubernetes-dashboard within 9m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000
E1218 15:51:20.470618    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:51:22.332255    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000: exit status 3 (1m15.092979491s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 15:52:24.670364    8340 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out
	E1218 15:52:24.670389    8340 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "embed-certs-732000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (690.19s)

TestStartStop/group/default-k8s-diff-port/serial/DeployApp (150.22s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-748000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) Non-zero exit: kubectl --context default-k8s-diff-port-748000 create -f testdata/busybox.yaml: exit status 1 (36.338516ms)

** stderr ** 
	error: no openapi getter

** /stderr **
start_stop_delete_test.go:196: kubectl --context default-k8s-diff-port-748000 create -f testdata/busybox.yaml failed: exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
E1218 15:50:28.852159    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 3 (1m15.090002337s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 15:51:43.214885    8327 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	E1218 15:51:43.214907    8327 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-748000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
E1218 15:51:51.779518    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:52:02.371340    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:52:02.894911    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 3 (1m15.090065309s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 15:52:58.306461    8356 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	E1218 15:52:58.306485    8356 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-748000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (150.22s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (690.18s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
E1218 15:52:45.383714    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:52:57.928644    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:53:36.159042    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:53:39.184593    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:54:30.695679    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:54:51.508760    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:54:55.648206    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:54:57.419866    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:55:11.418078    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:55:27.052893    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:56:22.335509    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:57:02.373833    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:57:02.897266    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:57:57.929059    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:58:07.646985    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:58:14.475877    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:58:22.808634    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:58:30.105580    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:58:36.161753    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:58:39.188881    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:59:51.512446    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 15:59:55.649942    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:59:57.424138    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 16:00:02.243317    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 16:00:05.480664    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 16:00:05.948908    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 16:00:11.420692    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 16:00:27.055658    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 16:00:28.855258    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.169.0.43:8443: i/o timeout
E1218 16:01:22.339101    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
helpers_test.go:329: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.169.0.43:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:287: ***** TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: pod "k8s-app=kubernetes-dashboard" failed to start within 9m0s: context deadline exceeded ****
start_stop_delete_test.go:287: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-732000 -n embed-certs-732000
E1218 16:01:25.861339    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 16:01:39.249545    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 16:01:51.782477    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 16:02:02.375538    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 16:02:02.898727    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-732000 -n embed-certs-732000: exit status 3 (1m15.091622069s)

-- stdout --
	Nonexistent

-- /stdout --
** stderr ** 
	E1218 16:02:39.768709    8576 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out
	E1218 16:02:39.768739    8576 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out

** /stderr **
start_stop_delete_test.go:287: status error: exit status 3 (may be ok)
start_stop_delete_test.go:287: "embed-certs-732000" apiserver is not running, skipping kubectl commands (state="Nonexistent")
start_stop_delete_test.go:288: failed waiting for 'addon dashboard' pod post-stop-start: k8s-app=kubernetes-dashboard within 9m0s: context deadline exceeded
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-732000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
start_stop_delete_test.go:291: (dbg) Non-zero exit: kubectl --context embed-certs-732000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard: context deadline exceeded (1.078µs)
start_stop_delete_test.go:293: failed to get info on kubernetes-dashboard deployments. args "kubectl --context embed-certs-732000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard": context deadline exceeded
start_stop_delete_test.go:297: addon did not load correct image. Expected to contain " registry.k8s.io/echoserver:1.4". Addon deployment info: 
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000
E1218 16:02:40.986820    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 16:02:54.562836    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 16:02:57.933488    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 16:02:58.705291    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 16:03:07.649713    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 16:03:22.811089    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 16:03:36.165200    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 16:03:39.189462    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000: exit status 3 (1m15.088636863s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 16:03:54.858246    8600 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out
	E1218 16:03:54.858270    8600 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.43:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "embed-certs-732000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (690.18s)

TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (225.3s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-748000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
E1218 15:53:07.643441    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 15:53:22.804188    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
start_stop_delete_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-748000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: exit status 11 (2m30.170735571s)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_ADDON_ENABLE_PAUSED: enabled failed: check paused: list paused: docker: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	* 
	╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                           │
	│    * If the above advice does not help, please let us know:                                                               │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                             │
	│                                                                                                                           │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                  │
	│    * Please also attach the following file to the GitHub issue:                                                           │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_addons_2bafae6fa40fec163538f94366e390b0317a8b15_0.log    │
	│                                                                                                                           │
	╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
start_stop_delete_test.go:207: failed to enable an addon post-stop. args "out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-748000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain": exit status 11
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-748000 describe deploy/metrics-server -n kube-system
start_stop_delete_test.go:215: (dbg) Non-zero exit: kubectl --context default-k8s-diff-port-748000 describe deploy/metrics-server -n kube-system: exit status 1 (36.781929ms)

** stderr ** 
	error: context "default-k8s-diff-port-748000" does not exist

** /stderr **
start_stop_delete_test.go:217: failed to get info on auto-pause deployments. args "kubectl --context default-k8s-diff-port-748000 describe deploy/metrics-server -n kube-system": exit status 1
start_stop_delete_test.go:221: addon did not load correct image. Expected to contain " fake.domain/registry.k8s.io/echoserver:1.4". Addon deployment info: 
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
E1218 15:55:28.853120    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 3 (1m15.089722794s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E1218 15:56:43.606805    8431 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	E1218 15:56:43.606822    8431 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-748000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (225.30s)

TestStartStop/group/default-k8s-diff-port/serial/SecondStart (695.95s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4
E1218 15:56:51.780774    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:56:51.907667    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4: signal: killed (10m20.855210491s)
-- stdout --
	* [default-k8s-diff-port-748000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting control plane node default-k8s-diff-port-748000 in cluster default-k8s-diff-port-748000
	* Restarting existing hyperkit VM for "default-k8s-diff-port-748000" ...
-- /stdout --
** stderr ** 
	I1218 15:56:45.253484    8471 out.go:296] Setting OutFile to fd 1 ...
	I1218 15:56:45.253694    8471 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:56:45.253700    8471 out.go:309] Setting ErrFile to fd 2...
	I1218 15:56:45.253704    8471 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 15:56:45.253903    8471 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 15:56:45.255284    8471 out.go:303] Setting JSON to false
	I1218 15:56:45.277737    8471 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":5176,"bootTime":1702938629,"procs":433,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 15:56:45.277849    8471 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 15:56:45.299656    8471 out.go:177] * [default-k8s-diff-port-748000] minikube v1.32.0 on Darwin 14.2
	I1218 15:56:45.342399    8471 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 15:56:45.342531    8471 notify.go:220] Checking for updates...
	I1218 15:56:45.385262    8471 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 15:56:45.406253    8471 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 15:56:45.427098    8471 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 15:56:45.448325    8471 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 15:56:45.469370    8471 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 15:56:45.493157    8471 config.go:182] Loaded profile config "default-k8s-diff-port-748000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 15:56:45.493842    8471 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:56:45.493895    8471 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:56:45.502920    8471 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56827
	I1218 15:56:45.503644    8471 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:56:45.504063    8471 main.go:141] libmachine: Using API Version  1
	I1218 15:56:45.504076    8471 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:56:45.504318    8471 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:56:45.504428    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:56:45.504621    8471 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 15:56:45.504859    8471 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:56:45.504881    8471 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:56:45.512809    8471 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56829
	I1218 15:56:45.513363    8471 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:56:45.513730    8471 main.go:141] libmachine: Using API Version  1
	I1218 15:56:45.513750    8471 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:56:45.513944    8471 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:56:45.514049    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:56:45.542266    8471 out.go:177] * Using the hyperkit driver based on existing profile
	I1218 15:56:45.584219    8471 start.go:298] selected driver: hyperkit
	I1218 15:56:45.584247    8471 start.go:902] validating driver "hyperkit" against &{Name:default-k8s-diff-port-748000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-748000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:56:45.584471    8471 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 15:56:45.588904    8471 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:56:45.589002    8471 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 15:56:45.596977    8471 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 15:56:45.601027    8471 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:56:45.601051    8471 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 15:56:45.601189    8471 start_flags.go:931] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 15:56:45.601255    8471 cni.go:84] Creating CNI manager for ""
	I1218 15:56:45.601266    8471 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 15:56:45.601282    8471 start_flags.go:323] config:
	{Name:default-k8s-diff-port-748000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:default-k8s-diff-port-748000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8444 NodeName:} Nodes:[{Name: IP: Port:8444 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[dashboard:true] CustomAddonImages:map[MetricsScraper:registry.k8s.io/echoserver:1.4] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 15:56:45.601425    8471 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 15:56:45.643207    8471 out.go:177] * Starting control plane node default-k8s-diff-port-748000 in cluster default-k8s-diff-port-748000
	I1218 15:56:45.664065    8471 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 15:56:45.664134    8471 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I1218 15:56:45.664166    8471 cache.go:56] Caching tarball of preloaded images
	I1218 15:56:45.664358    8471 preload.go:174] Found /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1218 15:56:45.664387    8471 cache.go:59] Finished verifying existence of preloaded tar for  v1.28.4 on docker
	I1218 15:56:45.664570    8471 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/default-k8s-diff-port-748000/config.json ...
	I1218 15:56:45.665279    8471 start.go:365] acquiring machines lock for default-k8s-diff-port-748000: {Name:mk129da0b7e14236047c6f70b7fc622a9cc1d994 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1218 15:56:45.665362    8471 start.go:369] acquired machines lock for "default-k8s-diff-port-748000" in 65.913µs
	I1218 15:56:45.665386    8471 start.go:96] Skipping create...Using existing machine configuration
	I1218 15:56:45.665399    8471 fix.go:54] fixHost starting: 
	I1218 15:56:45.665734    8471 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 15:56:45.665766    8471 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 15:56:45.674221    8471 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:56831
	I1218 15:56:45.674765    8471 main.go:141] libmachine: () Calling .GetVersion
	I1218 15:56:45.675106    8471 main.go:141] libmachine: Using API Version  1
	I1218 15:56:45.675117    8471 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 15:56:45.675334    8471 main.go:141] libmachine: () Calling .GetMachineName
	I1218 15:56:45.675451    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:56:45.675540    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetState
	I1218 15:56:45.675641    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:56:45.675697    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8177
	I1218 15:56:45.676632    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid 8177 missing from process table
	I1218 15:56:45.676670    8471 fix.go:102] recreateIfNeeded on default-k8s-diff-port-748000: state=Stopped err=<nil>
	I1218 15:56:45.676691    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	W1218 15:56:45.676790    8471 fix.go:128] unexpected machine state, will restart: <nil>
	I1218 15:56:45.719351    8471 out.go:177] * Restarting existing hyperkit VM for "default-k8s-diff-port-748000" ...
	I1218 15:56:45.742065    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .Start
	I1218 15:56:45.742366    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:56:45.742403    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid
	I1218 15:56:45.742488    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Using UUID 35162ef8-9dff-11ee-9910-f01898ef957c
	I1218 15:56:45.769514    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Generated MAC 96:14:14:3c:3b:a0
	I1218 15:56:45.769538    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000
	I1218 15:56:45.769703    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"35162ef8-9dff-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000351b60)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:56:45.769734    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"35162ef8-9dff-11ee-9910-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000351b60)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1218 15:56:45.769805    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "35162ef8-9dff-11ee-9910-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"}
	I1218 15:56:45.769844    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 35162ef8-9dff-11ee-9910-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/default-k8s-diff-port-748000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/tty,log=/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/console-ring -f kexec,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/bzimage,/Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=default-k8s-diff-port-748000"
	I1218 15:56:45.769857    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1218 15:56:45.771184    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 DEBUG: hyperkit: Pid is 8482
	I1218 15:56:45.771595    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Attempt 0
	I1218 15:56:45.771613    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 15:56:45.771657    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | hyperkit pid from json: 8482
	I1218 15:56:45.773752    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Searching for 96:14:14:3c:3b:a0 in /var/db/dhcpd_leases ...
	I1218 15:56:45.773873    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I1218 15:56:45.773890    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.45 HWAddress:96:14:14:3c:3b:a0 ID:1,96:14:14:3c:3b:a0 Lease:0x6580dc3b}
	I1218 15:56:45.773897    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | Found match: 96:14:14:3c:3b:a0
	I1218 15:56:45.773903    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | IP: 192.169.0.45
	I1218 15:56:45.773986    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetConfigRaw
	I1218 15:56:45.774690    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetIP
	I1218 15:56:45.774871    8471 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/default-k8s-diff-port-748000/config.json ...
	I1218 15:56:45.775293    8471 machine.go:88] provisioning docker machine ...
	I1218 15:56:45.775304    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .DriverName
	I1218 15:56:45.775433    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetMachineName
	I1218 15:56:45.775551    8471 buildroot.go:166] provisioning hostname "default-k8s-diff-port-748000"
	I1218 15:56:45.775561    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetMachineName
	I1218 15:56:45.775673    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHHostname
	I1218 15:56:45.775757    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHPort
	I1218 15:56:45.775849    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:56:45.775962    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHKeyPath
	I1218 15:56:45.776051    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) Calling .GetSSHUsername
	I1218 15:56:45.776175    8471 main.go:141] libmachine: Using SSH client type: native
	I1218 15:56:45.776482    8471 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1406660] 0x1409340 <nil>  [] 0s} 192.169.0.45 22 <nil> <nil>}
	I1218 15:56:45.776499    8471 main.go:141] libmachine: About to run SSH command:
	sudo hostname default-k8s-diff-port-748000 && echo "default-k8s-diff-port-748000" | sudo tee /etc/hostname
	I1218 15:56:45.779027    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1218 15:56:45.787891    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17822-999/.minikube/machines/default-k8s-diff-port-748000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1218 15:56:45.788806    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:56:45.788830    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:56:45.788838    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:56:45.788846    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:56:46.155781    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1218 15:56:46.155804    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1218 15:56:46.259829    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1218 15:56:46.259852    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1218 15:56:46.259894    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1218 15:56:46.259916    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1218 15:56:46.260743    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1218 15:56:46.260754    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:46 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1218 15:56:51.168118    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:51 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1218 15:56:51.168202    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:51 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1218 15:56:51.168214    8471 main.go:141] libmachine: (default-k8s-diff-port-748000) DBG | 2023/12/18 15:56:51 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1218 15:58:00.778516    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 15:59:18.779963    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 16:00:36.782013    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 16:01:54.782454    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 16:03:12.784558    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 16:04:30.785863    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
	I1218 16:05:48.789623    8471 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.45:22: connect: operation timed out
** /stderr **
start_stop_delete_test.go:259: failed to start minikube post-stop. args "out/minikube-darwin-amd64 start -p default-k8s-diff-port-748000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.4": signal: killed
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
E1218 16:07:57.984418    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 16:08:00.530031    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 16:08:07.702547    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 3 (1m15.093384727s)
-- stdout --
	Error
-- /stdout --
** stderr ** 
	E1218 16:08:21.203542    8779 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
	E1218 16:08:21.203566    8779 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.45:22: connect: operation timed out
** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "default-k8s-diff-port-748000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (695.95s)
Test pass (286/318)

Order  Passed test  Duration
3 TestDownloadOnly/v1.16.0/json-events 40.41
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.31
10 TestDownloadOnly/v1.28.4/json-events 9.15
11 TestDownloadOnly/v1.28.4/preload-exists 0
14 TestDownloadOnly/v1.28.4/kubectl 0
15 TestDownloadOnly/v1.28.4/LogsDuration 0.29
17 TestDownloadOnly/v1.29.0-rc.2/json-events 44.21
18 TestDownloadOnly/v1.29.0-rc.2/preload-exists 0
21 TestDownloadOnly/v1.29.0-rc.2/kubectl 0
22 TestDownloadOnly/v1.29.0-rc.2/LogsDuration 0.33
23 TestDownloadOnly/DeleteAll 0.4
24 TestDownloadOnly/DeleteAlwaysSucceeds 0.37
26 TestBinaryMirror 1.02
27 TestOffline 65.34
30 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.19
31 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.21
32 TestAddons/Setup 139.54
34 TestAddons/parallel/Registry 14.69
35 TestAddons/parallel/Ingress 19.98
36 TestAddons/parallel/InspektorGadget 10.54
37 TestAddons/parallel/MetricsServer 5.48
38 TestAddons/parallel/HelmTiller 11.38
40 TestAddons/parallel/CSI 51.26
41 TestAddons/parallel/Headlamp 12.98
42 TestAddons/parallel/CloudSpanner 5.42
43 TestAddons/parallel/LocalPath 52.48
44 TestAddons/parallel/NvidiaDevicePlugin 5.36
47 TestAddons/serial/GCPAuth/Namespaces 0.28
48 TestAddons/StoppedEnableDisable 5.77
49 TestCertOptions 40.37
50 TestCertExpiration 247.56
51 TestDockerFlags 49.71
52 TestForceSystemdFlag 38.07
53 TestForceSystemdEnv 38.89
56 TestHyperKitDriverInstallOrUpdate 7.56
59 TestErrorSpam/setup 35.82
60 TestErrorSpam/start 1.43
61 TestErrorSpam/status 0.51
62 TestErrorSpam/pause 1.3
63 TestErrorSpam/unpause 1.32
64 TestErrorSpam/stop 3.65
67 TestFunctional/serial/CopySyncFile 0.01
68 TestFunctional/serial/StartWithProxy 51.68
69 TestFunctional/serial/AuditLog 0
70 TestFunctional/serial/SoftStart 40.89
71 TestFunctional/serial/KubeContext 0.04
72 TestFunctional/serial/KubectlGetPods 0.06
75 TestFunctional/serial/CacheCmd/cache/add_remote 3.26
76 TestFunctional/serial/CacheCmd/cache/add_local 1.71
77 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.1
78 TestFunctional/serial/CacheCmd/cache/list 0.08
79 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.18
80 TestFunctional/serial/CacheCmd/cache/cache_reload 1.13
81 TestFunctional/serial/CacheCmd/cache/delete 0.16
82 TestFunctional/serial/MinikubeKubectlCmd 0.56
83 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.83
84 TestFunctional/serial/ExtraConfig 39.3
85 TestFunctional/serial/ComponentHealth 0.06
86 TestFunctional/serial/LogsCmd 2.88
87 TestFunctional/serial/LogsFileCmd 2.71
88 TestFunctional/serial/InvalidService 4.4
90 TestFunctional/parallel/ConfigCmd 0.53
91 TestFunctional/parallel/DashboardCmd 11.03
92 TestFunctional/parallel/DryRun 0.97
93 TestFunctional/parallel/InternationalLanguage 0.49
94 TestFunctional/parallel/StatusCmd 0.49
98 TestFunctional/parallel/ServiceCmdConnect 7.59
99 TestFunctional/parallel/AddonsCmd 0.26
100 TestFunctional/parallel/PersistentVolumeClaim 27.51
102 TestFunctional/parallel/SSHCmd 0.3
103 TestFunctional/parallel/CpCmd 1.1
104 TestFunctional/parallel/MySQL 26.13
105 TestFunctional/parallel/FileSync 0.22
106 TestFunctional/parallel/CertSync 1.22
110 TestFunctional/parallel/NodeLabels 0.07
112 TestFunctional/parallel/NonActiveRuntimeDisabled 0.21
114 TestFunctional/parallel/License 0.48
115 TestFunctional/parallel/Version/short 0.1
116 TestFunctional/parallel/Version/components 0.52
117 TestFunctional/parallel/ImageCommands/ImageListShort 0.19
118 TestFunctional/parallel/ImageCommands/ImageListTable 0.16
119 TestFunctional/parallel/ImageCommands/ImageListJson 0.16
120 TestFunctional/parallel/ImageCommands/ImageListYaml 0.17
121 TestFunctional/parallel/ImageCommands/ImageBuild 2.37
122 TestFunctional/parallel/ImageCommands/Setup 2.5
123 TestFunctional/parallel/DockerEnv/bash 0.92
124 TestFunctional/parallel/UpdateContextCmd/no_changes 0.22
125 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.25
126 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.25
127 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.51
128 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.27
129 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 5.3
130 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.23
131 TestFunctional/parallel/ImageCommands/ImageRemove 0.35
132 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.36
133 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 1.3
134 TestFunctional/parallel/ServiceCmd/DeployApp 13.12
136 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.39
137 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
139 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 10.17
140 TestFunctional/parallel/ServiceCmd/List 0.38
141 TestFunctional/parallel/ServiceCmd/JSONOutput 0.37
142 TestFunctional/parallel/ServiceCmd/HTTPS 0.25
143 TestFunctional/parallel/ServiceCmd/Format 0.26
144 TestFunctional/parallel/ServiceCmd/URL 0.26
145 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
146 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
147 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.04
148 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.03
149 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
150 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
151 TestFunctional/parallel/ProfileCmd/profile_not_create 0.3
152 TestFunctional/parallel/ProfileCmd/profile_list 0.28
153 TestFunctional/parallel/ProfileCmd/profile_json_output 0.27
154 TestFunctional/parallel/MountCmd/any-port 6.06
155 TestFunctional/parallel/MountCmd/specific-port 1.62
156 TestFunctional/parallel/MountCmd/VerifyCleanup 1.4
157 TestFunctional/delete_addon-resizer_images 0.13
158 TestFunctional/delete_my-image_image 0.05
159 TestFunctional/delete_minikube_cached_images 0.05
165 TestIngressAddonLegacy/StartLegacyK8sCluster 73.53
167 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 20.31
168 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.53
169 TestIngressAddonLegacy/serial/ValidateIngressAddons 36.23
172 TestJSONOutput/start/Command 48.61
173 TestJSONOutput/start/Audit 0
175 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
176 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
178 TestJSONOutput/pause/Command 0.46
179 TestJSONOutput/pause/Audit 0
181 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
182 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
184 TestJSONOutput/unpause/Command 0.43
185 TestJSONOutput/unpause/Audit 0
187 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
188 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
190 TestJSONOutput/stop/Command 8.16
191 TestJSONOutput/stop/Audit 0
193 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
194 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
195 TestErrorJSONOutput 0.77
200 TestMainNoArgs 0.08
201 TestMinikubeProfile 86.9
204 TestMountStart/serial/StartWithMountFirst 16.14
205 TestMountStart/serial/VerifyMountFirst 0.32
206 TestMountStart/serial/StartWithMountSecond 18.16
207 TestMountStart/serial/VerifyMountSecond 0.33
208 TestMountStart/serial/DeleteFirst 2.37
209 TestMountStart/serial/VerifyMountPostDelete 0.32
210 TestMountStart/serial/Stop 2.23
211 TestMountStart/serial/RestartStopped 16.71
212 TestMountStart/serial/VerifyMountPostStop 0.31
215 TestMultiNode/serial/FreshStart2Nodes 154.75
216 TestMultiNode/serial/DeployApp2Nodes 4.92
217 TestMultiNode/serial/PingHostFrom2Pods 0.91
218 TestMultiNode/serial/AddNode 32.17
219 TestMultiNode/serial/MultiNodeLabels 0.05
220 TestMultiNode/serial/ProfileList 0.2
221 TestMultiNode/serial/CopyFile 5.45
222 TestMultiNode/serial/StopNode 2.7
223 TestMultiNode/serial/StartAfterStop 27.15
224 TestMultiNode/serial/RestartKeepsNodes 142.39
225 TestMultiNode/serial/DeleteNode 2.9
226 TestMultiNode/serial/StopMultiNode 16.47
227 TestMultiNode/serial/RestartMultiNode 93.85
228 TestMultiNode/serial/ValidateNameConflict 47.58
232 TestPreload 170.35
234 TestScheduledStopUnix 103.37
235 TestSkaffold 112.49
238 TestRunningBinaryUpgrade 230.81
240 TestKubernetesUpgrade 189.67
253 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.98
254 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.84
255 TestStoppedBinaryUpgrade/Setup 0.92
256 TestStoppedBinaryUpgrade/Upgrade 152.38
258 TestPause/serial/Start 48.92
259 TestStoppedBinaryUpgrade/MinikubeLogs 2.43
268 TestNoKubernetes/serial/StartNoK8sWithVersion 0.44
269 TestNoKubernetes/serial/StartWithK8s 40.86
270 TestPause/serial/SecondStartNoReconfiguration 31.42
271 TestNoKubernetes/serial/StartWithStopK8s 16.28
272 TestNoKubernetes/serial/Start 16.14
273 TestPause/serial/Pause 0.52
274 TestPause/serial/VerifyStatus 0.16
275 TestPause/serial/Unpause 0.52
276 TestPause/serial/PauseAgain 0.58
277 TestPause/serial/DeletePaused 5.24
278 TestNoKubernetes/serial/VerifyK8sNotRunning 0.14
279 TestNoKubernetes/serial/ProfileList 30.89
280 TestPause/serial/VerifyDeletedResources 0.26
281 TestNetworkPlugins/group/auto/Start 90.26
282 TestNoKubernetes/serial/Stop 2.24
283 TestNoKubernetes/serial/StartNoArgs 17.33
284 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.14
285 TestNetworkPlugins/group/kindnet/Start 58.72
286 TestNetworkPlugins/group/auto/KubeletFlags 0.16
287 TestNetworkPlugins/group/auto/NetCatPod 13.32
288 TestNetworkPlugins/group/auto/DNS 0.13
289 TestNetworkPlugins/group/auto/Localhost 0.11
290 TestNetworkPlugins/group/auto/HairPin 0.11
291 TestNetworkPlugins/group/kindnet/ControllerPod 6.01
292 TestNetworkPlugins/group/kindnet/KubeletFlags 0.16
293 TestNetworkPlugins/group/kindnet/NetCatPod 12.21
294 TestNetworkPlugins/group/calico/Start 80.41
295 TestNetworkPlugins/group/kindnet/DNS 0.13
296 TestNetworkPlugins/group/kindnet/Localhost 0.1
297 TestNetworkPlugins/group/kindnet/HairPin 0.11
298 TestNetworkPlugins/group/custom-flannel/Start 59.1
299 TestNetworkPlugins/group/calico/ControllerPod 6.01
300 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.15
301 TestNetworkPlugins/group/custom-flannel/NetCatPod 11.17
302 TestNetworkPlugins/group/calico/KubeletFlags 0.17
303 TestNetworkPlugins/group/calico/NetCatPod 10.22
304 TestNetworkPlugins/group/custom-flannel/DNS 0.13
305 TestNetworkPlugins/group/custom-flannel/Localhost 0.11
306 TestNetworkPlugins/group/custom-flannel/HairPin 0.1
307 TestNetworkPlugins/group/calico/DNS 0.12
308 TestNetworkPlugins/group/calico/Localhost 0.1
309 TestNetworkPlugins/group/calico/HairPin 0.11
310 TestNetworkPlugins/group/false/Start 50.6
311 TestNetworkPlugins/group/enable-default-cni/Start 58.12
312 TestNetworkPlugins/group/false/KubeletFlags 0.16
313 TestNetworkPlugins/group/false/NetCatPod 12.22
314 TestNetworkPlugins/group/false/DNS 0.15
315 TestNetworkPlugins/group/false/Localhost 0.1
316 TestNetworkPlugins/group/false/HairPin 0.11
317 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.17
318 TestNetworkPlugins/group/enable-default-cni/NetCatPod 12.18
319 TestNetworkPlugins/group/enable-default-cni/DNS 0.14
320 TestNetworkPlugins/group/enable-default-cni/Localhost 0.11
321 TestNetworkPlugins/group/enable-default-cni/HairPin 0.1
322 TestNetworkPlugins/group/flannel/Start 59.03
323 TestNetworkPlugins/group/bridge/Start 49.31
324 TestNetworkPlugins/group/flannel/ControllerPod 6
325 TestNetworkPlugins/group/bridge/KubeletFlags 0.16
326 TestNetworkPlugins/group/bridge/NetCatPod 10.18
327 TestNetworkPlugins/group/flannel/KubeletFlags 0.17
328 TestNetworkPlugins/group/flannel/NetCatPod 11.18
329 TestNetworkPlugins/group/bridge/DNS 0.12
330 TestNetworkPlugins/group/bridge/Localhost 0.11
331 TestNetworkPlugins/group/bridge/HairPin 0.1
332 TestNetworkPlugins/group/flannel/DNS 0.13
333 TestNetworkPlugins/group/flannel/Localhost 0.11
334 TestNetworkPlugins/group/flannel/HairPin 0.1
335 TestNetworkPlugins/group/kubenet/Start 88.13
337 TestStartStop/group/old-k8s-version/serial/FirstStart 160.11
338 TestNetworkPlugins/group/kubenet/KubeletFlags 0.17
339 TestNetworkPlugins/group/kubenet/NetCatPod 11.19
340 TestNetworkPlugins/group/kubenet/DNS 0.13
341 TestNetworkPlugins/group/kubenet/Localhost 0.1
342 TestNetworkPlugins/group/kubenet/HairPin 0.11
347 TestStartStop/group/old-k8s-version/serial/DeployApp 9.31
348 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.73
349 TestStartStop/group/old-k8s-version/serial/Stop 8.29
350 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.34
351 TestStartStop/group/old-k8s-version/serial/SecondStart 497.48
352 TestStartStop/group/no-preload/serial/Stop 2.24
353 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.32
354 TestStartStop/group/no-preload/serial/SecondStart 106.2
355 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 6.01
356 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.07
357 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.16
358 TestStartStop/group/no-preload/serial/Pause 1.83
360 TestStartStop/group/embed-certs/serial/FirstStart 50.26
361 TestStartStop/group/embed-certs/serial/DeployApp 9.24
362 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.9
363 TestStartStop/group/embed-certs/serial/Stop 8.28
364 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.32
366 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 6
367 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
368 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.17
369 TestStartStop/group/old-k8s-version/serial/Pause 1.83
376 TestStartStop/group/default-k8s-diff-port/serial/Stop 1.27
377 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.32
380 TestStartStop/group/newest-cni/serial/FirstStart 594.88
381 TestStartStop/group/newest-cni/serial/DeployApp 0
382 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 0.86
383 TestStartStop/group/newest-cni/serial/Stop 8.28
384 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.32
385 TestStartStop/group/newest-cni/serial/SecondStart 35.7
386 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
387 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
388 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.17
389 TestStartStop/group/newest-cni/serial/Pause 1.74
TestDownloadOnly/v1.16.0/json-events (40.41s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (40.411256661s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (40.41s)

TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

TestDownloadOnly/v1.16.0/LogsDuration (0.31s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:172: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-306000
aaa_download_only_test.go:172: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-306000: exit status 85 (312.344706ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2023/12/18 14:36:00
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.21.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 14:36:00.538270    1485 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:36:00.538575    1485 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:00.538581    1485 out.go:309] Setting ErrFile to fd 2...
	I1218 14:36:00.538585    1485 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:00.538780    1485 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	W1218 14:36:00.538882    1485 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: open /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: no such file or directory
	I1218 14:36:00.540620    1485 out.go:303] Setting JSON to true
	I1218 14:36:00.565114    1485 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":331,"bootTime":1702938629,"procs":419,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 14:36:00.565209    1485 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 14:36:00.585661    1485 out.go:97] [download-only-306000] minikube v1.32.0 on Darwin 14.2
	I1218 14:36:00.609565    1485 out.go:169] MINIKUBE_LOCATION=17822
	W1218 14:36:00.585907    1485 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball: no such file or directory
	I1218 14:36:00.585903    1485 notify.go:220] Checking for updates...
	I1218 14:36:00.657531    1485 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 14:36:00.678687    1485 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 14:36:00.700740    1485 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 14:36:00.721383    1485 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	W1218 14:36:00.781718    1485 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 14:36:00.782279    1485 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 14:36:00.871639    1485 out.go:97] Using the hyperkit driver based on user configuration
	I1218 14:36:00.871722    1485 start.go:298] selected driver: hyperkit
	I1218 14:36:00.871735    1485 start.go:902] validating driver "hyperkit" against <nil>
	I1218 14:36:00.871954    1485 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:00.872320    1485 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 14:36:00.982273    1485 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 14:36:00.986739    1485 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:36:00.986758    1485 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 14:36:00.986785    1485 start_flags.go:309] no existing cluster config was found, will generate one from the flags 
	I1218 14:36:00.991341    1485 start_flags.go:394] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I1218 14:36:00.991487    1485 start_flags.go:913] Wait components to verify : map[apiserver:true system_pods:true]
	I1218 14:36:00.991551    1485 cni.go:84] Creating CNI manager for ""
	I1218 14:36:00.991565    1485 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I1218 14:36:00.991574    1485 start_flags.go:323] config:
	{Name:download-only-306000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-306000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:36:00.991849    1485 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:01.013396    1485 out.go:97] Downloading VM boot image ...
	I1218 14:36:01.013485    1485 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/iso/amd64/minikube-v1.32.1-1702708929-17806-amd64.iso
	I1218 14:36:05.076014    1485 out.go:97] Starting control plane node download-only-306000 in cluster download-only-306000
	I1218 14:36:05.076051    1485 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I1218 14:36:05.131734    1485 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I1218 14:36:05.131798    1485 cache.go:56] Caching tarball of preloaded images
	I1218 14:36:05.132142    1485 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I1218 14:36:05.152624    1485 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I1218 14:36:05.152714    1485 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:05.231731    1485 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I1218 14:36:10.126120    1485 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:10.126291    1485 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:10.666763    1485 cache.go:59] Finished verifying existence of preloaded tar for  v1.16.0 on docker
	I1218 14:36:10.667026    1485 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/download-only-306000/config.json ...
	I1218 14:36:10.667050    1485 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/download-only-306000/config.json: {Name:mk6643fea47a537eae939c441a725747580760f7 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 14:36:10.667383    1485 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I1218 14:36:10.667734    1485 download.go:107] Downloading: https://dl.k8s.io/release/v1.16.0/bin/darwin/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.16.0/bin/darwin/amd64/kubectl.sha1 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/darwin/amd64/v1.16.0/kubectl
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-306000"

-- /stdout --
aaa_download_only_test.go:173: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.31s)

TestDownloadOnly/v1.28.4/json-events (9.15s)

=== RUN   TestDownloadOnly/v1.28.4/json-events
aaa_download_only_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=docker --driver=hyperkit : (9.14884482s)
--- PASS: TestDownloadOnly/v1.28.4/json-events (9.15s)

TestDownloadOnly/v1.28.4/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.4/preload-exists
--- PASS: TestDownloadOnly/v1.28.4/preload-exists (0.00s)

TestDownloadOnly/v1.28.4/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.4/kubectl
--- PASS: TestDownloadOnly/v1.28.4/kubectl (0.00s)

TestDownloadOnly/v1.28.4/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.28.4/LogsDuration
aaa_download_only_test.go:172: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-306000
aaa_download_only_test.go:172: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-306000: exit status 85 (292.420113ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	| start   | -o=json --download-only        | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.28.4   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2023/12/18 14:36:41
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.21.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 14:36:41.266189    1521 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:36:41.266431    1521 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:41.266437    1521 out.go:309] Setting ErrFile to fd 2...
	I1218 14:36:41.266441    1521 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:41.266613    1521 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	W1218 14:36:41.266708    1521 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: open /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: no such file or directory
	I1218 14:36:41.267944    1521 out.go:303] Setting JSON to true
	I1218 14:36:41.289830    1521 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":372,"bootTime":1702938629,"procs":423,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 14:36:41.289919    1521 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 14:36:41.311344    1521 out.go:97] [download-only-306000] minikube v1.32.0 on Darwin 14.2
	I1218 14:36:41.332827    1521 out.go:169] MINIKUBE_LOCATION=17822
	I1218 14:36:41.311579    1521 notify.go:220] Checking for updates...
	I1218 14:36:41.375929    1521 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 14:36:41.417866    1521 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 14:36:41.459711    1521 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 14:36:41.481050    1521 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	W1218 14:36:41.522805    1521 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 14:36:41.523617    1521 config.go:182] Loaded profile config "download-only-306000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	W1218 14:36:41.523701    1521 start.go:810] api.Load failed for download-only-306000: filestore "download-only-306000": Docker machine "download-only-306000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1218 14:36:41.523846    1521 driver.go:392] Setting default libvirt URI to qemu:///system
	W1218 14:36:41.523885    1521 start.go:810] api.Load failed for download-only-306000: filestore "download-only-306000": Docker machine "download-only-306000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1218 14:36:41.553903    1521 out.go:97] Using the hyperkit driver based on existing profile
	I1218 14:36:41.553987    1521 start.go:298] selected driver: hyperkit
	I1218 14:36:41.554000    1521 start.go:902] validating driver "hyperkit" against &{Name:download-only-306000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-306000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:36:41.554353    1521 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:41.554561    1521 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 14:36:41.563671    1521 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 14:36:41.567534    1521 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:36:41.567555    1521 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 14:36:41.570320    1521 cni.go:84] Creating CNI manager for ""
	I1218 14:36:41.570340    1521 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 14:36:41.570354    1521 start_flags.go:323] config:
	{Name:download-only-306000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:download-only-306000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:36:41.570486    1521 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:41.591854    1521 out.go:97] Starting control plane node download-only-306000 in cluster download-only-306000
	I1218 14:36:41.591887    1521 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 14:36:41.647635    1521 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.4/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	I1218 14:36:41.647729    1521 cache.go:56] Caching tarball of preloaded images
	I1218 14:36:41.648185    1521 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime docker
	I1218 14:36:41.669731    1521 out.go:97] Downloading Kubernetes v1.28.4 preload ...
	I1218 14:36:41.669752    1521 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:41.752133    1521 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.4/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4?checksum=md5:7ebdea7754e21f51b865dbfc36b53b7d -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-306000"

-- /stdout --
aaa_download_only_test.go:173: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.4/LogsDuration (0.29s)

TestDownloadOnly/v1.29.0-rc.2/json-events (44.21s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/json-events
aaa_download_only_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-306000 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=docker --driver=hyperkit : (44.209403945s)
--- PASS: TestDownloadOnly/v1.29.0-rc.2/json-events (44.21s)

TestDownloadOnly/v1.29.0-rc.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/preload-exists
--- PASS: TestDownloadOnly/v1.29.0-rc.2/preload-exists (0.00s)

TestDownloadOnly/v1.29.0-rc.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/kubectl
--- PASS: TestDownloadOnly/v1.29.0-rc.2/kubectl (0.00s)

TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.33s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/LogsDuration
aaa_download_only_test.go:172: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-306000
aaa_download_only_test.go:172: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-306000: exit status 85 (333.787203ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|-----------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |               Args                |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only           | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000           |                      |         |         |                     |          |
	|         | --force --alsologtostderr         |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0      |                      |         |         |                     |          |
	|         | --container-runtime=docker        |                      |         |         |                     |          |
	|         | --driver=hyperkit                 |                      |         |         |                     |          |
	| start   | -o=json --download-only           | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000           |                      |         |         |                     |          |
	|         | --force --alsologtostderr         |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.28.4      |                      |         |         |                     |          |
	|         | --container-runtime=docker        |                      |         |         |                     |          |
	|         | --driver=hyperkit                 |                      |         |         |                     |          |
	| start   | -o=json --download-only           | download-only-306000 | jenkins | v1.32.0 | 18 Dec 23 14:36 PST |          |
	|         | -p download-only-306000           |                      |         |         |                     |          |
	|         | --force --alsologtostderr         |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.29.0-rc.2 |                      |         |         |                     |          |
	|         | --container-runtime=docker        |                      |         |         |                     |          |
	|         | --driver=hyperkit                 |                      |         |         |                     |          |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2023/12/18 14:36:50
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.21.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 14:36:50.706945    1537 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:36:50.707245    1537 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:50.707249    1537 out.go:309] Setting ErrFile to fd 2...
	I1218 14:36:50.707253    1537 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:36:50.707443    1537 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	W1218 14:36:50.707542    1537 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: open /Users/jenkins/minikube-integration/17822-999/.minikube/config/config.json: no such file or directory
	I1218 14:36:50.708711    1537 out.go:303] Setting JSON to true
	I1218 14:36:50.730890    1537 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":381,"bootTime":1702938629,"procs":424,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 14:36:50.730991    1537 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 14:36:50.753222    1537 out.go:97] [download-only-306000] minikube v1.32.0 on Darwin 14.2
	I1218 14:36:50.774764    1537 out.go:169] MINIKUBE_LOCATION=17822
	I1218 14:36:50.753495    1537 notify.go:220] Checking for updates...
	I1218 14:36:50.818073    1537 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 14:36:50.839664    1537 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 14:36:50.860936    1537 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 14:36:50.881993    1537 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	W1218 14:36:50.945841    1537 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 14:36:50.946595    1537 config.go:182] Loaded profile config "download-only-306000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	W1218 14:36:50.946687    1537 start.go:810] api.Load failed for download-only-306000: filestore "download-only-306000": Docker machine "download-only-306000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1218 14:36:50.946851    1537 driver.go:392] Setting default libvirt URI to qemu:///system
	W1218 14:36:50.946908    1537 start.go:810] api.Load failed for download-only-306000: filestore "download-only-306000": Docker machine "download-only-306000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1218 14:36:50.976689    1537 out.go:97] Using the hyperkit driver based on existing profile
	I1218 14:36:50.976749    1537 start.go:298] selected driver: hyperkit
	I1218 14:36:50.976764    1537 start.go:902] validating driver "hyperkit" against &{Name:download-only-306000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:download-only-306000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:36:50.977186    1537 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:50.977383    1537 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17822-999/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1218 14:36:50.986940    1537 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.32.0
	I1218 14:36:50.991288    1537 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:36:50.991307    1537 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1218 14:36:50.994126    1537 cni.go:84] Creating CNI manager for ""
	I1218 14:36:50.994145    1537 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I1218 14:36:50.994158    1537 start_flags.go:323] config:
	{Name:download-only-306000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:download-only-306000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:36:50.994335    1537 iso.go:125] acquiring lock: {Name:mk6c2133f2dd3312b15d4fc195383881e10096e9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 14:36:51.015768    1537 out.go:97] Starting control plane node download-only-306000 in cluster download-only-306000
	I1218 14:36:51.015803    1537 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I1218 14:36:51.073607    1537 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I1218 14:36:51.073690    1537 cache.go:56] Caching tarball of preloaded images
	I1218 14:36:51.074140    1537 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I1218 14:36:51.095498    1537 out.go:97] Downloading Kubernetes v1.29.0-rc.2 preload ...
	I1218 14:36:51.095526    1537 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:51.177876    1537 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4?checksum=md5:74b99cd9fa76659778caad266ad399ba -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4
	I1218 14:36:56.244985    1537 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:56.245174    1537 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/17822-999/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-docker-overlay2-amd64.tar.lz4 ...
	I1218 14:36:56.783998    1537 cache.go:59] Finished verifying existence of preloaded tar for  v1.29.0-rc.2 on docker
	I1218 14:36:56.784085    1537 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/download-only-306000/config.json ...
	I1218 14:36:56.784471    1537 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime docker
	I1218 14:36:56.784702    1537 download.go:107] Downloading: https://dl.k8s.io/release/v1.29.0-rc.2/bin/darwin/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.29.0-rc.2/bin/darwin/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/17822-999/.minikube/cache/darwin/amd64/v1.29.0-rc.2/kubectl
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-306000"

-- /stdout --
aaa_download_only_test.go:173: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.33s)

TestDownloadOnly/DeleteAll (0.4s)

=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:190: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/DeleteAll (0.40s)

TestDownloadOnly/DeleteAlwaysSucceeds (0.37s)

=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:202: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-306000
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.37s)

TestBinaryMirror (1.02s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-694000 --alsologtostderr --binary-mirror http://127.0.0.1:49356 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-694000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-694000
--- PASS: TestBinaryMirror (1.02s)

TestOffline (65.34s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-896000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-896000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (1m0.011368802s)
helpers_test.go:175: Cleaning up "offline-docker-896000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-896000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-896000: (5.330936142s)
--- PASS: TestOffline (65.34s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.19s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:927: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-080000
addons_test.go:927: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable dashboard -p addons-080000: exit status 85 (187.358815ms)

-- stdout --
	* Profile "addons-080000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-080000"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.19s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.21s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:938: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-080000
addons_test.go:938: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons disable dashboard -p addons-080000: exit status 85 (208.306795ms)

-- stdout --
	* Profile "addons-080000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-080000"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.21s)

TestAddons/Setup (139.54s)

=== RUN   TestAddons/Setup
addons_test.go:109: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-080000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:109: (dbg) Done: out/minikube-darwin-amd64 start -p addons-080000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m19.541853601s)
--- PASS: TestAddons/Setup (139.54s)

TestAddons/parallel/Registry (14.69s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:329: registry stabilized in 11.123345ms
addons_test.go:331: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-nmpzf" [a8fe2cce-e5d8-407e-85f9-7ec4c844e514] Running
addons_test.go:331: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.005075521s
addons_test.go:334: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-j2z2g" [89a7f666-dfb5-4ea9-979e-ebe884846914] Running
addons_test.go:334: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.00327073s
addons_test.go:339: (dbg) Run:  kubectl --context addons-080000 delete po -l run=registry-test --now
addons_test.go:344: (dbg) Run:  kubectl --context addons-080000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:344: (dbg) Done: kubectl --context addons-080000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (3.982038374s)
addons_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 ip
2023/12/18 14:40:11 [DEBUG] GET http://192.169.0.3:5000
addons_test.go:387: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (14.69s)

TestAddons/parallel/Ingress (19.98s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress
=== CONT  TestAddons/parallel/Ingress
addons_test.go:206: (dbg) Run:  kubectl --context addons-080000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:231: (dbg) Run:  kubectl --context addons-080000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:244: (dbg) Run:  kubectl --context addons-080000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:249: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [80147b53-7ea8-4948-a4b5-7b3787f8251e] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [80147b53-7ea8-4948-a4b5-7b3787f8251e] Running
addons_test.go:249: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 10.004295434s
addons_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:285: (dbg) Run:  kubectl --context addons-080000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 ip
addons_test.go:296: (dbg) Run:  nslookup hello-john.test 192.169.0.3
addons_test.go:305: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable ingress --alsologtostderr -v=1
addons_test.go:310: (dbg) Done: out/minikube-darwin-amd64 -p addons-080000 addons disable ingress --alsologtostderr -v=1: (7.480607056s)
--- PASS: TestAddons/parallel/Ingress (19.98s)

TestAddons/parallel/InspektorGadget (10.54s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget
=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:837: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-cqzxf" [e0eb891b-0cd8-4cdf-b825-008467d26945] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:837: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.006050276s
addons_test.go:840: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-080000
addons_test.go:840: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-080000: (5.535141193s)
--- PASS: TestAddons/parallel/InspektorGadget (10.54s)

TestAddons/parallel/MetricsServer (5.48s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:406: metrics-server stabilized in 2.924556ms
addons_test.go:408: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-7c66d45ddc-4tj86" [0a35a95e-f517-4309-b326-d25a1c021742] Running
addons_test.go:408: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.004670699s
addons_test.go:414: (dbg) Run:  kubectl --context addons-080000 top pods -n kube-system
addons_test.go:431: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.48s)

TestAddons/parallel/HelmTiller (11.38s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller
=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:455: tiller-deploy stabilized in 2.506686ms
addons_test.go:457: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-7b677967b9-cggzh" [c4428805-c46b-4b7e-aede-5fc466cc2711] Running
addons_test.go:457: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 6.003137205s
addons_test.go:472: (dbg) Run:  kubectl --context addons-080000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:472: (dbg) Done: kubectl --context addons-080000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.943128331s)
addons_test.go:489: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (11.38s)

TestAddons/parallel/CSI (51.26s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI
=== CONT  TestAddons/parallel/CSI
addons_test.go:560: csi-hostpath-driver pods stabilized in 11.301949ms
addons_test.go:563: (dbg) Run:  kubectl --context addons-080000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:568: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:573: (dbg) Run:  kubectl --context addons-080000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:578: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [e91e79a3-04ac-4d33-b0d6-d7ff754f55c4] Pending
helpers_test.go:344: "task-pv-pod" [e91e79a3-04ac-4d33-b0d6-d7ff754f55c4] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [e91e79a3-04ac-4d33-b0d6-d7ff754f55c4] Running
addons_test.go:578: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 14.004381413s
addons_test.go:583: (dbg) Run:  kubectl --context addons-080000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:588: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-080000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-080000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:593: (dbg) Run:  kubectl --context addons-080000 delete pod task-pv-pod
addons_test.go:599: (dbg) Run:  kubectl --context addons-080000 delete pvc hpvc
addons_test.go:605: (dbg) Run:  kubectl --context addons-080000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:610: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:615: (dbg) Run:  kubectl --context addons-080000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:620: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [8c2b752b-12e8-4e04-800b-80606fd8f0e7] Pending
helpers_test.go:344: "task-pv-pod-restore" [8c2b752b-12e8-4e04-800b-80606fd8f0e7] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [8c2b752b-12e8-4e04-800b-80606fd8f0e7] Running
addons_test.go:620: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 9.003335348s
addons_test.go:625: (dbg) Run:  kubectl --context addons-080000 delete pod task-pv-pod-restore
addons_test.go:629: (dbg) Run:  kubectl --context addons-080000 delete pvc hpvc-restore
addons_test.go:633: (dbg) Run:  kubectl --context addons-080000 delete volumesnapshot new-snapshot-demo
addons_test.go:637: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:637: (dbg) Done: out/minikube-darwin-amd64 -p addons-080000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.389994293s)
addons_test.go:641: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (51.26s)

TestAddons/parallel/Headlamp (12.98s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp
=== CONT  TestAddons/parallel/Headlamp
addons_test.go:823: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-080000 --alsologtostderr -v=1
addons_test.go:828: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-777fd4b855-k4htw" [f1ee26b5-194c-4a77-b7c2-1d6cb5999383] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-777fd4b855-k4htw" [f1ee26b5-194c-4a77-b7c2-1d6cb5999383] Running
addons_test.go:828: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.002417436s
--- PASS: TestAddons/parallel/Headlamp (12.98s)

TestAddons/parallel/CloudSpanner (5.42s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner
=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:856: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-5649c69bf6-tngbf" [9adfe2d8-f63a-4119-bd14-c2f66ad7421d] Running
addons_test.go:856: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.004667868s
addons_test.go:859: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-080000
--- PASS: TestAddons/parallel/CloudSpanner (5.42s)

TestAddons/parallel/LocalPath (52.48s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath
=== CONT  TestAddons/parallel/LocalPath
addons_test.go:872: (dbg) Run:  kubectl --context addons-080000 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:878: (dbg) Run:  kubectl --context addons-080000 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:882: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:885: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [ea4e1396-b563-42a4-a835-860c4665c493] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [ea4e1396-b563-42a4-a835-860c4665c493] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [ea4e1396-b563-42a4-a835-860c4665c493] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:885: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 4.002965254s
addons_test.go:890: (dbg) Run:  kubectl --context addons-080000 get pvc test-pvc -o=json
addons_test.go:899: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 ssh "cat /opt/local-path-provisioner/pvc-5a5f7ad1-c20c-40e6-807d-426c6fc1d3cc_default_test-pvc/file1"
addons_test.go:911: (dbg) Run:  kubectl --context addons-080000 delete pod test-local-path
addons_test.go:915: (dbg) Run:  kubectl --context addons-080000 delete pvc test-pvc
addons_test.go:919: (dbg) Run:  out/minikube-darwin-amd64 -p addons-080000 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:919: (dbg) Done: out/minikube-darwin-amd64 -p addons-080000 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.7809125s)
--- PASS: TestAddons/parallel/LocalPath (52.48s)

TestAddons/parallel/NvidiaDevicePlugin (5.36s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin
=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:951: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-46vct" [4519af08-9156-479b-9d45-c43f19c9f452] Running
addons_test.go:951: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.003792691s
addons_test.go:954: (dbg) Run:  out/minikube-darwin-amd64 addons disable nvidia-device-plugin -p addons-080000
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.36s)

TestAddons/serial/GCPAuth/Namespaces (0.28s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:649: (dbg) Run:  kubectl --context addons-080000 create ns new-namespace
addons_test.go:663: (dbg) Run:  kubectl --context addons-080000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.28s)

TestAddons/StoppedEnableDisable (5.77s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-080000
addons_test.go:171: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-080000: (5.234773419s)
addons_test.go:175: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-080000
addons_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-080000
addons_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-080000
--- PASS: TestAddons/StoppedEnableDisable (5.77s)

TestCertOptions (40.37s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions
=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-539000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
E1218 15:10:28.680364    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-539000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (34.757506528s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-539000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-539000 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-539000 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-539000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-539000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-539000: (5.267280906s)
--- PASS: TestCertOptions (40.37s)

TestCertExpiration (247.56s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-868000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
E1218 15:09:57.247210    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:10:02.065358    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-868000 --memory=2048 --cert-expiration=3m --driver=hyperkit : (34.398698075s)
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-868000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
E1218 15:13:38.722025    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:13:39.012857    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-868000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (27.890436984s)
helpers_test.go:175: Cleaning up "cert-expiration-868000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-868000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-868000: (5.267215417s)
--- PASS: TestCertExpiration (247.56s)

TestDockerFlags (49.71s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags
=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-827000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:51: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-827000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (45.964506653s)
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-827000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-827000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-827000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-827000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-827000: (3.410702923s)
--- PASS: TestDockerFlags (49.71s)

TestForceSystemdFlag (38.07s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag
=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-278000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:91: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-278000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (34.480964636s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-278000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-278000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-278000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-278000: (3.409683143s)
--- PASS: TestForceSystemdFlag (38.07s)

TestForceSystemdEnv (38.89s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv
=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-946000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
E1218 15:08:39.011937    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
docker_test.go:155: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-946000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (33.458464236s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-946000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-946000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-946000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-946000: (5.268400306s)
--- PASS: TestForceSystemdEnv (38.89s)

TestHyperKitDriverInstallOrUpdate (7.56s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate
=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (7.56s)

TestErrorSpam/setup (35.82s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-207000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-207000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 --driver=hyperkit : (35.819014155s)
--- PASS: TestErrorSpam/setup (35.82s)

TestErrorSpam/start (1.43s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 start --dry-run
--- PASS: TestErrorSpam/start (1.43s)

TestErrorSpam/status (0.51s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 status
--- PASS: TestErrorSpam/status (0.51s)

TestErrorSpam/pause (1.3s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 pause
--- PASS: TestErrorSpam/pause (1.30s)

TestErrorSpam/unpause (1.32s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 unpause
--- PASS: TestErrorSpam/unpause (1.32s)

TestErrorSpam/stop (3.65s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 stop: (3.22897694s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 stop
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-207000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-207000 stop
--- PASS: TestErrorSpam/stop (3.65s)

TestFunctional/serial/CopySyncFile (0.01s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1854: local sync path: /Users/jenkins/minikube-integration/17822-999/.minikube/files/etc/test/nested/copy/1483/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.01s)

TestFunctional/serial/StartWithProxy (51.68s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2233: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2233: (dbg) Done: out/minikube-darwin-amd64 start -p functional-821000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (51.675173097s)
--- PASS: TestFunctional/serial/StartWithProxy (51.68s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (40.89s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --alsologtostderr -v=8
functional_test.go:655: (dbg) Done: out/minikube-darwin-amd64 start -p functional-821000 --alsologtostderr -v=8: (40.886988449s)
functional_test.go:659: soft start took 40.887433028s for "functional-821000" cluster.
--- PASS: TestFunctional/serial/SoftStart (40.89s)

TestFunctional/serial/KubeContext (0.04s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.06s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-821000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.06s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.26s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:3.1: (1.209636448s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:3.3: (1.025245591s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:latest
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 cache add registry.k8s.io/pause:latest: (1.022494878s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.26s)

TestFunctional/serial/CacheCmd/cache/add_local (1.71s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialCacheCmdcacheadd_local1381634415/001
functional_test.go:1085: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache add minikube-local-cache-test:functional-821000
functional_test.go:1090: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache delete minikube-local-cache-test:functional-821000
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-821000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.71s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.1s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.10s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.18s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.18s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.13s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh sudo docker rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (153.027338ms)
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cache reload
functional_test.go:1159: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.13s)

TestFunctional/serial/CacheCmd/cache/delete (0.16s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.16s)

TestFunctional/serial/MinikubeKubectlCmd (0.56s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 kubectl -- --context functional-821000 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.56s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.83s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-821000 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.83s)

TestFunctional/serial/ExtraConfig (39.3s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1218 14:44:57.100563    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.107782    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.118484    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.138578    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.180674    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.260853    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.420945    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:57.741617    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:58.382326    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:44:59.662506    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:45:02.223679    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:45:07.344472    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-darwin-amd64 start -p functional-821000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (39.299939929s)
functional_test.go:757: restart took 39.300064742s for "functional-821000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (39.30s)

TestFunctional/serial/ComponentHealth (0.06s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-821000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.06s)

TestFunctional/serial/LogsCmd (2.88s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 logs
functional_test.go:1232: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 logs: (2.877305371s)
--- PASS: TestFunctional/serial/LogsCmd (2.88s)

TestFunctional/serial/LogsFileCmd (2.71s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd1906512295/001/logs.txt
E1218 14:45:17.585135    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
functional_test.go:1246: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd1906512295/001/logs.txt: (2.712183089s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.71s)

TestFunctional/serial/InvalidService (4.4s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2320: (dbg) Run:  kubectl --context functional-821000 apply -f testdata/invalidsvc.yaml
functional_test.go:2334: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-821000
functional_test.go:2334: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-821000: exit status 115 (277.982906ms)
-- stdout --
	|-----------|-------------|-------------|--------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |           URL            |
	|-----------|-------------|-------------|--------------------------|
	| default   | invalid-svc |          80 | http://192.169.0.5:31991 |
	|-----------|-------------|-------------|--------------------------|
	
	
-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
functional_test.go:2326: (dbg) Run:  kubectl --context functional-821000 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.40s)

TestFunctional/parallel/ConfigCmd (0.53s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 config get cpus: exit status 14 (67.330891ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 config get cpus: exit status 14 (57.289646ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.53s)

TestFunctional/parallel/DashboardCmd (11.03s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-821000 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-821000 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 2773: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (11.03s)

TestFunctional/parallel/DryRun (0.97s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:970: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-821000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (501.339662ms)
-- stdout --
	* [functional-821000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I1218 14:46:24.080850    2724 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:46:24.081053    2724 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:46:24.081058    2724 out.go:309] Setting ErrFile to fd 2...
	I1218 14:46:24.081063    2724 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:46:24.081256    2724 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 14:46:24.083284    2724 out.go:303] Setting JSON to false
	I1218 14:46:24.108566    2724 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":955,"bootTime":1702938629,"procs":480,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 14:46:24.108666    2724 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 14:46:24.130351    2724 out.go:177] * [functional-821000] minikube v1.32.0 on Darwin 14.2
	I1218 14:46:24.172252    2724 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 14:46:24.172286    2724 notify.go:220] Checking for updates...
	I1218 14:46:24.214924    2724 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 14:46:24.236178    2724 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 14:46:24.257079    2724 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 14:46:24.277953    2724 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 14:46:24.299179    2724 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 14:46:24.320976    2724 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 14:46:24.321785    2724 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:46:24.321880    2724 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:46:24.331286    2724 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50541
	I1218 14:46:24.331650    2724 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:46:24.332066    2724 main.go:141] libmachine: Using API Version  1
	I1218 14:46:24.332077    2724 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:46:24.332275    2724 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:46:24.332372    2724 main.go:141] libmachine: (functional-821000) Calling .DriverName
	I1218 14:46:24.332550    2724 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 14:46:24.332783    2724 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:46:24.332810    2724 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:46:24.340779    2724 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50543
	I1218 14:46:24.341128    2724 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:46:24.341456    2724 main.go:141] libmachine: Using API Version  1
	I1218 14:46:24.341467    2724 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:46:24.341686    2724 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:46:24.341790    2724 main.go:141] libmachine: (functional-821000) Calling .DriverName
	I1218 14:46:24.371053    2724 out.go:177] * Using the hyperkit driver based on existing profile
	I1218 14:46:24.413092    2724 start.go:298] selected driver: hyperkit
	I1218 14:46:24.413104    2724 start.go:902] validating driver "hyperkit" against &{Name:functional-821000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:functional-821000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.169.0.5 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:46:24.413241    2724 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 14:46:24.437949    2724 out.go:177] 
	W1218 14:46:24.459239    2724 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1218 14:46:24.480158    2724 out.go:177] 

** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (0.97s)

TestFunctional/parallel/InternationalLanguage (0.49s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-821000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-821000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (486.503412ms)

-- stdout --
	* [functional-821000] minikube v1.32.0 sur Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1218 14:46:25.041530    2740 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:46:25.041728    2740 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:46:25.041735    2740 out.go:309] Setting ErrFile to fd 2...
	I1218 14:46:25.041739    2740 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:46:25.041976    2740 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 14:46:25.043512    2740 out.go:303] Setting JSON to false
	I1218 14:46:25.067163    2740 start.go:128] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":956,"bootTime":1702938629,"procs":480,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.2","kernelVersion":"23.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1218 14:46:25.067280    2740 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I1218 14:46:25.088396    2740 out.go:177] * [functional-821000] minikube v1.32.0 sur Darwin 14.2
	I1218 14:46:25.130305    2740 out.go:177]   - MINIKUBE_LOCATION=17822
	I1218 14:46:25.130346    2740 notify.go:220] Checking for updates...
	I1218 14:46:25.172146    2740 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	I1218 14:46:25.193443    2740 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1218 14:46:25.214479    2740 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 14:46:25.235305    2740 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	I1218 14:46:25.256398    2740 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 14:46:25.277972    2740 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 14:46:25.278657    2740 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:46:25.278730    2740 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:46:25.287894    2740 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50551
	I1218 14:46:25.288265    2740 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:46:25.288690    2740 main.go:141] libmachine: Using API Version  1
	I1218 14:46:25.288700    2740 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:46:25.288907    2740 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:46:25.288999    2740 main.go:141] libmachine: (functional-821000) Calling .DriverName
	I1218 14:46:25.289194    2740 driver.go:392] Setting default libvirt URI to qemu:///system
	I1218 14:46:25.289431    2740 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:46:25.289453    2740 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:46:25.297380    2740 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50553
	I1218 14:46:25.297778    2740 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:46:25.298131    2740 main.go:141] libmachine: Using API Version  1
	I1218 14:46:25.298143    2740 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:46:25.298368    2740 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:46:25.298488    2740 main.go:141] libmachine: (functional-821000) Calling .DriverName
	I1218 14:46:25.327381    2740 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I1218 14:46:25.369382    2740 start.go:298] selected driver: hyperkit
	I1218 14:46:25.369404    2740 start.go:902] validating driver "hyperkit" against &{Name:functional-821000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17806/minikube-v1.32.1-1702708929-17806-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1702920864-17822@sha256:4842b362f06b33d847d73f7ed166c93ce608f4c4cea49b711c7055fd50ebd1e0 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:functional-821000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.169.0.5 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s GPUs:}
	I1218 14:46:25.369602    2740 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 14:46:25.395328    2740 out.go:177] 
	W1218 14:46:25.416538    2740 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1218 14:46:25.437421    2740 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.49s)

TestFunctional/parallel/StatusCmd (0.49s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 status
functional_test.go:856: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.49s)

TestFunctional/parallel/ServiceCmdConnect (7.59s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1628: (dbg) Run:  kubectl --context functional-821000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1634: (dbg) Run:  kubectl --context functional-821000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1639: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-55497b8b78-nxrxl" [37061c66-7460-44c0-b596-3a39549ef2e9] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-55497b8b78-nxrxl" [37061c66-7460-44c0-b596-3a39549ef2e9] Running
functional_test.go:1639: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.004255247s
functional_test.go:1648: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service hello-node-connect --url
functional_test.go:1654: found endpoint for hello-node-connect: http://192.169.0.5:32532
functional_test.go:1674: http://192.169.0.5:32532: success! body:

Hostname: hello-node-connect-55497b8b78-nxrxl

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.169.0.5:8080/

Request Headers:
	accept-encoding=gzip
	host=192.169.0.5:32532
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.59s)

TestFunctional/parallel/AddonsCmd (0.26s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1689: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 addons list
functional_test.go:1701: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.26s)

TestFunctional/parallel/PersistentVolumeClaim (27.51s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [f0d3d3a4-7b19-489d-9f51-746b7867ff3e] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.003915181s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-821000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-821000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-821000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-821000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [49e23f59-4730-413b-bb61-b18519ce4a49] Pending
helpers_test.go:344: "sp-pod" [49e23f59-4730-413b-bb61-b18519ce4a49] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [49e23f59-4730-413b-bb61-b18519ce4a49] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 13.002400874s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-821000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-821000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-821000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [9dbd0b18-86aa-4333-958f-bdc2177e81e8] Pending
helpers_test.go:344: "sp-pod" [9dbd0b18-86aa-4333-958f-bdc2177e81e8] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [9dbd0b18-86aa-4333-958f-bdc2177e81e8] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.038586712s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-821000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (27.51s)

TestFunctional/parallel/SSHCmd (0.3s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1724: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "echo hello"
functional_test.go:1741: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.30s)

TestFunctional/parallel/CpCmd (1.1s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh -n functional-821000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cp functional-821000:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelCpCmd755408319/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh -n functional-821000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh -n functional-821000 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (1.10s)

TestFunctional/parallel/MySQL (26.13s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: (dbg) Run:  kubectl --context functional-821000 replace --force -f testdata/mysql.yaml
functional_test.go:1798: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-859648c796-n24pg" [c18feb2e-2b24-489e-824f-e600af393d05] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-859648c796-n24pg" [c18feb2e-2b24-489e-824f-e600af393d05] Running
functional_test.go:1798: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 23.004341883s
functional_test.go:1806: (dbg) Run:  kubectl --context functional-821000 exec mysql-859648c796-n24pg -- mysql -ppassword -e "show databases;"
functional_test.go:1806: (dbg) Non-zero exit: kubectl --context functional-821000 exec mysql-859648c796-n24pg -- mysql -ppassword -e "show databases;": exit status 1 (209.890458ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1806: (dbg) Run:  kubectl --context functional-821000 exec mysql-859648c796-n24pg -- mysql -ppassword -e "show databases;"
functional_test.go:1806: (dbg) Non-zero exit: kubectl --context functional-821000 exec mysql-859648c796-n24pg -- mysql -ppassword -e "show databases;": exit status 1 (107.488991ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1806: (dbg) Run:  kubectl --context functional-821000 exec mysql-859648c796-n24pg -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (26.13s)

TestFunctional/parallel/FileSync (0.22s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1928: Checking for existence of /etc/test/nested/copy/1483/hosts within VM
functional_test.go:1930: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /etc/test/nested/copy/1483/hosts"
functional_test.go:1935: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.22s)

TestFunctional/parallel/CertSync (1.22s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1971: Checking for existence of /etc/ssl/certs/1483.pem within VM
functional_test.go:1972: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /etc/ssl/certs/1483.pem"
functional_test.go:1971: Checking for existence of /usr/share/ca-certificates/1483.pem within VM
functional_test.go:1972: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /usr/share/ca-certificates/1483.pem"
functional_test.go:1971: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1972: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1998: Checking for existence of /etc/ssl/certs/14832.pem within VM
functional_test.go:1999: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /etc/ssl/certs/14832.pem"
functional_test.go:1998: Checking for existence of /usr/share/ca-certificates/14832.pem within VM
functional_test.go:1999: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /usr/share/ca-certificates/14832.pem"
functional_test.go:1998: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1999: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.22s)

TestFunctional/parallel/NodeLabels (0.07s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-821000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.07s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.21s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2026: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo systemctl is-active crio"
functional_test.go:2026: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh "sudo systemctl is-active crio": exit status 1 (211.269948ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.21s)

TestFunctional/parallel/License (0.48s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2287: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.48s)

TestFunctional/parallel/Version/short (0.1s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2255: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)

TestFunctional/parallel/Version/components (0.52s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2269: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.52s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-821000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.28.4
registry.k8s.io/kube-proxy:v1.28.4
registry.k8s.io/kube-controller-manager:v1.28.4
registry.k8s.io/kube-apiserver:v1.28.4
registry.k8s.io/etcd:3.5.9-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.10.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-821000
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-821000
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-821000 image ls --format short --alsologtostderr:
I1218 14:46:27.245343    2774 out.go:296] Setting OutFile to fd 1 ...
I1218 14:46:27.245755    2774 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.245764    2774 out.go:309] Setting ErrFile to fd 2...
I1218 14:46:27.245769    2774 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.245967    2774 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
I1218 14:46:27.246585    2774 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.246684    2774 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.247049    2774 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.247111    2774 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.255112    2774 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50600
I1218 14:46:27.255502    2774 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.255969    2774 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.255980    2774 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.256222    2774 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.256335    2774 main.go:141] libmachine: (functional-821000) Calling .GetState
I1218 14:46:27.256431    2774 main.go:141] libmachine: (functional-821000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I1218 14:46:27.256504    2774 main.go:141] libmachine: (functional-821000) DBG | hyperkit pid from json: 2032
I1218 14:46:27.257873    2774 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.257898    2774 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.265904    2774 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50602
I1218 14:46:27.266337    2774 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.266663    2774 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.266674    2774 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.266919    2774 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.267017    2774 main.go:141] libmachine: (functional-821000) Calling .DriverName
I1218 14:46:27.267180    2774 ssh_runner.go:195] Run: systemctl --version
I1218 14:46:27.267201    2774 main.go:141] libmachine: (functional-821000) Calling .GetSSHHostname
I1218 14:46:27.267274    2774 main.go:141] libmachine: (functional-821000) Calling .GetSSHPort
I1218 14:46:27.267350    2774 main.go:141] libmachine: (functional-821000) Calling .GetSSHKeyPath
I1218 14:46:27.267425    2774 main.go:141] libmachine: (functional-821000) Calling .GetSSHUsername
I1218 14:46:27.267513    2774 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/functional-821000/id_rsa Username:docker}
I1218 14:46:27.320314    2774 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I1218 14:46:27.347051    2774 main.go:141] libmachine: Making call to close driver server
I1218 14:46:27.347060    2774 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:27.347206    2774 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:27.347217    2774 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:27.347221    2774 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
I1218 14:46:27.347224    2774 main.go:141] libmachine: Making call to close driver server
I1218 14:46:27.347233    2774 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:27.347372    2774 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:27.347386    2774 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:27.347401    2774 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)
TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-821000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| registry.k8s.io/kube-proxy                  | v1.28.4           | 83f6cc407eed8 | 73.2MB |
| registry.k8s.io/pause                       | 3.9               | e6f1816883972 | 744kB  |
| gcr.io/k8s-minikube/busybox                 | latest            | beae173ccac6a | 1.24MB |
| gcr.io/google-containers/addon-resizer      | functional-821000 | ffd4cfbbe753e | 32.9MB |
| registry.k8s.io/coredns/coredns             | v1.10.1           | ead0a4a53df89 | 53.6MB |
| docker.io/library/mysql                     | 5.7               | 5107333e08a87 | 501MB  |
| docker.io/library/nginx                     | alpine            | 01e5c69afaf63 | 42.6MB |
| registry.k8s.io/kube-apiserver              | v1.28.4           | 7fe0e6f37db33 | 126MB  |
| registry.k8s.io/kube-scheduler              | v1.28.4           | e3db313c6dbc0 | 60.1MB |
| docker.io/localhost/my-image                | functional-821000 | 65055bd7305ab | 1.24MB |
| docker.io/library/minikube-local-cache-test | functional-821000 | b0df746d0010d | 30B    |
| docker.io/library/nginx                     | latest            | a6bd71f48f683 | 187MB  |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| registry.k8s.io/kube-controller-manager     | v1.28.4           | d058aa5ab969c | 122MB  |
| registry.k8s.io/etcd                        | 3.5.9-0           | 73deb9a3f7025 | 294MB  |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-821000 image ls --format table --alsologtostderr:
I1218 14:46:30.123978    2800 out.go:296] Setting OutFile to fd 1 ...
I1218 14:46:30.124186    2800 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:30.124191    2800 out.go:309] Setting ErrFile to fd 2...
I1218 14:46:30.124195    2800 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:30.124389    2800 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
I1218 14:46:30.124988    2800 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:30.125083    2800 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:30.125478    2800 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:30.125514    2800 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:30.133315    2800 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50643
I1218 14:46:30.133738    2800 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:30.134158    2800 main.go:141] libmachine: Using API Version  1
I1218 14:46:30.134170    2800 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:30.134422    2800 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:30.134552    2800 main.go:141] libmachine: (functional-821000) Calling .GetState
I1218 14:46:30.134656    2800 main.go:141] libmachine: (functional-821000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I1218 14:46:30.134707    2800 main.go:141] libmachine: (functional-821000) DBG | hyperkit pid from json: 2032
I1218 14:46:30.136027    2800 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:30.136051    2800 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:30.144026    2800 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50645
I1218 14:46:30.144414    2800 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:30.144769    2800 main.go:141] libmachine: Using API Version  1
I1218 14:46:30.144780    2800 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:30.144993    2800 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:30.145095    2800 main.go:141] libmachine: (functional-821000) Calling .DriverName
I1218 14:46:30.145248    2800 ssh_runner.go:195] Run: systemctl --version
I1218 14:46:30.145271    2800 main.go:141] libmachine: (functional-821000) Calling .GetSSHHostname
I1218 14:46:30.145364    2800 main.go:141] libmachine: (functional-821000) Calling .GetSSHPort
I1218 14:46:30.145455    2800 main.go:141] libmachine: (functional-821000) Calling .GetSSHKeyPath
I1218 14:46:30.145531    2800 main.go:141] libmachine: (functional-821000) Calling .GetSSHUsername
I1218 14:46:30.145607    2800 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/functional-821000/id_rsa Username:docker}
I1218 14:46:30.182048    2800 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I1218 14:46:30.205091    2800 main.go:141] libmachine: Making call to close driver server
I1218 14:46:30.205103    2800 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:30.205300    2800 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:30.205311    2800 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:30.205321    2800 main.go:141] libmachine: Making call to close driver server
I1218 14:46:30.205337    2800 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:30.205518    2800 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:30.205526    2800 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:30.205537    2800 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
2023/12/18 14:46:36 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)
TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-821000 image ls --format json --alsologtostderr:
[{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"01e5c69afaf635f66aab0b59404a0ac72db1e2e519c3f41a1ff53d37c35bba41","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"42600000"},{"id":"ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.10.1"],"size":"53600000"},{"id":"beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:latest"],"size":"1240000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-821000"],"size":"32900000"},{"id":"65055bd7305abae11fa7cba64cb9413633d4a8ccf102066ed525723cc2105b14","repoDigests":[],"repoTags":["docker.io/localhost/my-image:functional-821000"],"size":"1240000"},{"id":"a6bd71f48f6839d9faae1f29d3babef831e76bc213107682c5cc80f0cbb30866","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"187000000"},{"id":"7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.28.4"],"size":"126000000"},{"id":"e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.28.4"],"size":"60100000"},{"id":"d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.28.4"],"size":"122000000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"683000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"b0df746d0010d95f56d022863004be8c330b6825a6ca3d54c9e23c3ec29718ae","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-821000"],"size":"30"},{"id":"5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"501000000"},{"id":"83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.28.4"],"size":"73200000"},{"id":"73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.9-0"],"size":"294000000"},{"id":"e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.9"],"size":"744000"}]
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-821000 image ls --format json --alsologtostderr:
I1218 14:46:29.966788    2796 out.go:296] Setting OutFile to fd 1 ...
I1218 14:46:29.967085    2796 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:29.967090    2796 out.go:309] Setting ErrFile to fd 2...
I1218 14:46:29.967094    2796 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:29.967275    2796 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
I1218 14:46:29.967884    2796 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:29.967979    2796 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:29.968390    2796 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:29.968437    2796 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:29.975946    2796 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50637
I1218 14:46:29.976345    2796 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:29.976763    2796 main.go:141] libmachine: Using API Version  1
I1218 14:46:29.976773    2796 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:29.976969    2796 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:29.977070    2796 main.go:141] libmachine: (functional-821000) Calling .GetState
I1218 14:46:29.977143    2796 main.go:141] libmachine: (functional-821000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I1218 14:46:29.977215    2796 main.go:141] libmachine: (functional-821000) DBG | hyperkit pid from json: 2032
I1218 14:46:29.978540    2796 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:29.978578    2796 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:29.986799    2796 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50639
I1218 14:46:29.987166    2796 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:29.987529    2796 main.go:141] libmachine: Using API Version  1
I1218 14:46:29.987549    2796 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:29.987768    2796 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:29.987874    2796 main.go:141] libmachine: (functional-821000) Calling .DriverName
I1218 14:46:29.988020    2796 ssh_runner.go:195] Run: systemctl --version
I1218 14:46:29.988041    2796 main.go:141] libmachine: (functional-821000) Calling .GetSSHHostname
I1218 14:46:29.988122    2796 main.go:141] libmachine: (functional-821000) Calling .GetSSHPort
I1218 14:46:29.988196    2796 main.go:141] libmachine: (functional-821000) Calling .GetSSHKeyPath
I1218 14:46:29.988275    2796 main.go:141] libmachine: (functional-821000) Calling .GetSSHUsername
I1218 14:46:29.988358    2796 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/functional-821000/id_rsa Username:docker}
I1218 14:46:30.023633    2796 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I1218 14:46:30.042600    2796 main.go:141] libmachine: Making call to close driver server
I1218 14:46:30.042610    2796 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:30.042783    2796 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:30.042792    2796 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:30.042798    2796 main.go:141] libmachine: Making call to close driver server
I1218 14:46:30.042803    2796 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:30.042803    2796 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
I1218 14:46:30.042929    2796 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:30.042941    2796 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:30.042957    2796 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)
TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-821000 image ls --format yaml --alsologtostderr:
- id: 5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "501000000"
- id: e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.28.4
size: "60100000"
- id: b0df746d0010d95f56d022863004be8c330b6825a6ca3d54c9e23c3ec29718ae
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-821000
size: "30"
- id: 01e5c69afaf635f66aab0b59404a0ac72db1e2e519c3f41a1ff53d37c35bba41
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "42600000"
- id: 7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.28.4
size: "126000000"
- id: ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.10.1
size: "53600000"
- id: e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.9
size: "744000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-821000
size: "32900000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: a6bd71f48f6839d9faae1f29d3babef831e76bc213107682c5cc80f0cbb30866
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "187000000"
- id: d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.28.4
size: "122000000"
- id: 83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.28.4
size: "73200000"
- id: 73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.9-0
size: "294000000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"

functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-821000 image ls --format yaml --alsologtostderr:
I1218 14:46:27.430906    2778 out.go:296] Setting OutFile to fd 1 ...
I1218 14:46:27.431123    2778 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.431130    2778 out.go:309] Setting ErrFile to fd 2...
I1218 14:46:27.431134    2778 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.431337    2778 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
I1218 14:46:27.431958    2778 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.432054    2778 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.432418    2778 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.432463    2778 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.440647    2778 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50610
I1218 14:46:27.441119    2778 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.441593    2778 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.441603    2778 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.441840    2778 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.441954    2778 main.go:141] libmachine: (functional-821000) Calling .GetState
I1218 14:46:27.442041    2778 main.go:141] libmachine: (functional-821000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I1218 14:46:27.442116    2778 main.go:141] libmachine: (functional-821000) DBG | hyperkit pid from json: 2032
I1218 14:46:27.443625    2778 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.443655    2778 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.452204    2778 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50612
I1218 14:46:27.452617    2778 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.452992    2778 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.453006    2778 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.453223    2778 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.453348    2778 main.go:141] libmachine: (functional-821000) Calling .DriverName
I1218 14:46:27.453508    2778 ssh_runner.go:195] Run: systemctl --version
I1218 14:46:27.453530    2778 main.go:141] libmachine: (functional-821000) Calling .GetSSHHostname
I1218 14:46:27.453607    2778 main.go:141] libmachine: (functional-821000) Calling .GetSSHPort
I1218 14:46:27.453694    2778 main.go:141] libmachine: (functional-821000) Calling .GetSSHKeyPath
I1218 14:46:27.453788    2778 main.go:141] libmachine: (functional-821000) Calling .GetSSHUsername
I1218 14:46:27.453879    2778 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/functional-821000/id_rsa Username:docker}
I1218 14:46:27.492997    2778 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I1218 14:46:27.515832    2778 main.go:141] libmachine: Making call to close driver server
I1218 14:46:27.515847    2778 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:27.515985    2778 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:27.515992    2778 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:27.515998    2778 main.go:141] libmachine: Making call to close driver server
I1218 14:46:27.516005    2778 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:27.516010    2778 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
I1218 14:46:27.516126    2778 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:27.516138    2778 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:27.516153    2778 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)
TestFunctional/parallel/ImageCommands/ImageBuild (2.37s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh pgrep buildkitd: exit status 1 (158.524462ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image build -t localhost/my-image:functional-821000 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image build -t localhost/my-image:functional-821000 testdata/build --alsologtostderr: (1.992724877s)
functional_test.go:319: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-821000 image build -t localhost/my-image:functional-821000 testdata/build --alsologtostderr:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 81d3fab0decb
Removing intermediate container 81d3fab0decb
---> a2ab81984bf8
Step 3/3 : ADD content.txt /
---> 65055bd7305a
Successfully built 65055bd7305a
Successfully tagged localhost/my-image:functional-821000
functional_test.go:322: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-821000 image build -t localhost/my-image:functional-821000 testdata/build --alsologtostderr:
I1218 14:46:27.758021    2787 out.go:296] Setting OutFile to fd 1 ...
I1218 14:46:27.758336    2787 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.758342    2787 out.go:309] Setting ErrFile to fd 2...
I1218 14:46:27.758346    2787 out.go:343] TERM=,COLORTERM=, which probably does not support color
I1218 14:46:27.758533    2787 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
I1218 14:46:27.759139    2787 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.760458    2787 config.go:182] Loaded profile config "functional-821000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
I1218 14:46:27.760852    2787 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.760897    2787 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.768941    2787 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50624
I1218 14:46:27.769400    2787 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.769906    2787 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.769920    2787 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.770146    2787 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.770249    2787 main.go:141] libmachine: (functional-821000) Calling .GetState
I1218 14:46:27.770338    2787 main.go:141] libmachine: (functional-821000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I1218 14:46:27.770410    2787 main.go:141] libmachine: (functional-821000) DBG | hyperkit pid from json: 2032
I1218 14:46:27.771741    2787 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I1218 14:46:27.771772    2787 main.go:141] libmachine: Launching plugin server for driver hyperkit
I1218 14:46:27.779901    2787 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50626
I1218 14:46:27.780282    2787 main.go:141] libmachine: () Calling .GetVersion
I1218 14:46:27.780623    2787 main.go:141] libmachine: Using API Version  1
I1218 14:46:27.780634    2787 main.go:141] libmachine: () Calling .SetConfigRaw
I1218 14:46:27.780855    2787 main.go:141] libmachine: () Calling .GetMachineName
I1218 14:46:27.780962    2787 main.go:141] libmachine: (functional-821000) Calling .DriverName
I1218 14:46:27.781110    2787 ssh_runner.go:195] Run: systemctl --version
I1218 14:46:27.781130    2787 main.go:141] libmachine: (functional-821000) Calling .GetSSHHostname
I1218 14:46:27.781219    2787 main.go:141] libmachine: (functional-821000) Calling .GetSSHPort
I1218 14:46:27.781329    2787 main.go:141] libmachine: (functional-821000) Calling .GetSSHKeyPath
I1218 14:46:27.781416    2787 main.go:141] libmachine: (functional-821000) Calling .GetSSHUsername
I1218 14:46:27.781500    2787 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/functional-821000/id_rsa Username:docker}
I1218 14:46:27.830780    2787 build_images.go:151] Building image from path: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1960083228.tar
I1218 14:46:27.830900    2787 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1218 14:46:27.849679    2787 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1960083228.tar
I1218 14:46:27.853431    2787 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1960083228.tar: stat -c "%s %y" /var/lib/minikube/build/build.1960083228.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.1960083228.tar': No such file or directory
I1218 14:46:27.853472    2787 ssh_runner.go:362] scp /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1960083228.tar --> /var/lib/minikube/build/build.1960083228.tar (3072 bytes)
I1218 14:46:27.877379    2787 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1960083228
I1218 14:46:27.895772    2787 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1960083228 -xf /var/lib/minikube/build/build.1960083228.tar
I1218 14:46:27.902245    2787 docker.go:346] Building image: /var/lib/minikube/build/build.1960083228
I1218 14:46:27.902320    2787 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-821000 /var/lib/minikube/build/build.1960083228
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
Install the buildx component to build images with BuildKit:
https://docs.docker.com/go/buildx/

I1218 14:46:29.630770    2787 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-821000 /var/lib/minikube/build/build.1960083228: (1.728430021s)
I1218 14:46:29.630828    2787 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1960083228
I1218 14:46:29.639212    2787 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1960083228.tar
I1218 14:46:29.647656    2787 build_images.go:207] Built localhost/my-image:functional-821000 from /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1960083228.tar
I1218 14:46:29.647688    2787 build_images.go:123] succeeded building to: functional-821000
I1218 14:46:29.647695    2787 build_images.go:124] failed building to: 
I1218 14:46:29.647711    2787 main.go:141] libmachine: Making call to close driver server
I1218 14:46:29.647718    2787 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:29.647888    2787 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
I1218 14:46:29.647914    2787 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:29.647923    2787 main.go:141] libmachine: Making call to close connection to plugin binary
I1218 14:46:29.647932    2787 main.go:141] libmachine: Making call to close driver server
I1218 14:46:29.647940    2787 main.go:141] libmachine: (functional-821000) Calling .Close
I1218 14:46:29.648096    2787 main.go:141] libmachine: (functional-821000) DBG | Closing plugin on server side
I1218 14:46:29.648120    2787 main.go:141] libmachine: Successfully made call to close driver server
I1218 14:46:29.648127    2787 main.go:141] libmachine: Making call to close connection to plugin binary
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.37s)

TestFunctional/parallel/ImageCommands/Setup (2.5s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:341: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (2.40825315s)
functional_test.go:346: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-821000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (2.50s)

TestFunctional/parallel/DockerEnv/bash (0.92s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:495: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-821000 docker-env) && out/minikube-darwin-amd64 status -p functional-821000"
functional_test.go:518: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-821000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.92s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.22s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2118: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.22s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.25s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2118: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.25s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.25s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2118: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.25s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.51s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr: (3.323787364s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.51s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.27s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr: (2.093324679s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.27s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.3s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:234: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (1.834587921s)
functional_test.go:239: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-821000
functional_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr
E1218 14:45:38.066827    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
functional_test.go:244: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image load --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr: (3.175609738s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.30s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image save gcr.io/google-containers/addon-resizer:functional-821000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:379: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image save gcr.io/google-containers/addon-resizer:functional-821000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.232426667s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.23s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.35s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image rm gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.35s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.36s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:408: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.196890667s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.36s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.3s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-821000
functional_test.go:423: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 image save --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr
functional_test.go:423: (dbg) Done: out/minikube-darwin-amd64 -p functional-821000 image save --daemon gcr.io/google-containers/addon-resizer:functional-821000 --alsologtostderr: (1.190088675s)
functional_test.go:428: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-821000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.30s)

TestFunctional/parallel/ServiceCmd/DeployApp (13.12s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1438: (dbg) Run:  kubectl --context functional-821000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1444: (dbg) Run:  kubectl --context functional-821000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1449: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-d7447cc7f-trc9g" [776a7817-cfe6-4a5a-8035-d2dd5d3da6b6] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-d7447cc7f-trc9g" [776a7817-cfe6-4a5a-8035-d2dd5d3da6b6] Running
functional_test.go:1449: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 13.00406709s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (13.12s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.39s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to kill pid 2476: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.39s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.17s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-821000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:344: "nginx-svc" [5a3cda3c-55c8-43bd-8f4a-49b413c9f6fc] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx-svc" [5a3cda3c-55c8-43bd-8f4a-49b413c9f6fc] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 10.004406963s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.17s)

TestFunctional/parallel/ServiceCmd/List (0.38s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1458: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.38s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1488: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service list -o json
functional_test.go:1493: Took "372.675079ms" to run "out/minikube-darwin-amd64 -p functional-821000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.25s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1508: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service --namespace=default --https --url hello-node
functional_test.go:1521: found endpoint: https://192.169.0.5:31697
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.25s)

TestFunctional/parallel/ServiceCmd/Format (0.26s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1539: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.26s)

TestFunctional/parallel/ServiceCmd/URL (0.26s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1558: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 service hello-node --url
functional_test.go:1564: found endpoint for hello-node: http://192.169.0.5:31697
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.26s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-821000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.111.245.136 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-821000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.3s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1269: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1274: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.30s)

TestFunctional/parallel/ProfileCmd/profile_list (0.28s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1309: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1314: Took "198.597568ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1323: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1328: Took "77.437509ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.28s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.27s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1360: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1365: Took "197.711468ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1373: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1378: Took "76.02809ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.27s)

TestFunctional/parallel/MountCmd/any-port (6.06s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port3763371701/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1702939574953867000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port3763371701/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1702939574953867000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port3763371701/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1702939574953867000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port3763371701/001/test-1702939574953867000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (153.131838ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 18 22:46 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 18 22:46 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 18 22:46 test-1702939574953867000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh cat /mount-9p/test-1702939574953867000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-821000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [024a070c-6156-460e-8248-f7ac49e12ba2] Pending
helpers_test.go:344: "busybox-mount" [024a070c-6156-460e-8248-f7ac49e12ba2] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [024a070c-6156-460e-8248-f7ac49e12ba2] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
E1218 14:46:19.169918    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
helpers_test.go:344: "busybox-mount" [024a070c-6156-460e-8248-f7ac49e12ba2] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.003071754s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-821000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port3763371701/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (6.06s)

TestFunctional/parallel/MountCmd/specific-port (1.62s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port2963849214/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (187.715728ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port2963849214/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh "sudo umount -f /mount-9p": exit status 1 (129.081086ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-821000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port2963849214/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.62s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.4s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T" /mount1: exit status 1 (162.849873ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-821000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-821000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-821000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1515698451/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.40s)

TestFunctional/delete_addon-resizer_images (0.13s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-821000
--- PASS: TestFunctional/delete_addon-resizer_images (0.13s)

TestFunctional/delete_my-image_image (0.05s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-821000
--- PASS: TestFunctional/delete_my-image_image (0.05s)

TestFunctional/delete_minikube_cached_images (0.05s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-821000
--- PASS: TestFunctional/delete_minikube_cached_images (0.05s)

TestIngressAddonLegacy/StartLegacyK8sCluster (73.53s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-428000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
E1218 14:47:41.091125    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-428000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m13.531031364s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (73.53s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (20.31s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons enable ingress --alsologtostderr -v=5
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons enable ingress --alsologtostderr -v=5: (20.31156412s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (20.31s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.53s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.53s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (36.23s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:206: (dbg) Run:  kubectl --context ingress-addon-legacy-428000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:206: (dbg) Done: kubectl --context ingress-addon-legacy-428000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (8.947566941s)
addons_test.go:231: (dbg) Run:  kubectl --context ingress-addon-legacy-428000 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:244: (dbg) Run:  kubectl --context ingress-addon-legacy-428000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:249: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [c16786c9-0e53-4ab7-bbf2-c01ef768ae16] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [c16786c9-0e53-4ab7-bbf2-c01ef768ae16] Running
addons_test.go:249: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 9.003737161s
addons_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:285: (dbg) Run:  kubectl --context ingress-addon-legacy-428000 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 ip
addons_test.go:296: (dbg) Run:  nslookup hello-john.test 192.169.0.7
addons_test.go:305: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:305: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons disable ingress-dns --alsologtostderr -v=1: (10.056887063s)
addons_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons disable ingress --alsologtostderr -v=1
addons_test.go:310: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-428000 addons disable ingress --alsologtostderr -v=1: (7.276003472s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (36.23s)

TestJSONOutput/start/Command (48.61s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-346000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E1218 14:49:57.243020    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-346000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (48.608554038s)
--- PASS: TestJSONOutput/start/Command (48.61s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.46s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-346000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.46s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.43s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-346000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.43s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.16s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-346000 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-346000 --output=json --user=testUser: (8.159628703s)
--- PASS: TestJSONOutput/stop/Command (8.16s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.77s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-033000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-033000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (382.810414ms)

-- stdout --
	{"specversion":"1.0","id":"b41e1745-0525-43c1-b4c6-375a07de1aab","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-033000] minikube v1.32.0 on Darwin 14.2","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"6b8542b8-178d-42d8-b0b7-a783ee4932f8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=17822"}}
	{"specversion":"1.0","id":"7e13e1c4-cdbc-4f26-a7d4-104ff672b396","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig"}}
	{"specversion":"1.0","id":"93dbfb3e-a3f3-45f3-b316-342e9877b7d6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"f6b1b89f-43b3-4019-8ee9-b085511549bf","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"58741d18-bb5f-4da1-9cd5-43bb816b9e02","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube"}}
	{"specversion":"1.0","id":"3482aeeb-6dcc-4a68-a840-815d217561f4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"4d648658-88f2-4851-959d-ec3f8589761b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-033000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-033000
--- PASS: TestErrorJSONOutput (0.77s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMinikubeProfile (86.9s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-433000 --driver=hyperkit 
E1218 14:50:24.933806    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:50:28.677047    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.682490    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.692569    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.712697    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.754546    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.835816    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:28.997567    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:29.319321    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:29.960326    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:31.240609    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:33.800940    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:38.921705    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:50:49.163027    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-433000 --driver=hyperkit : (38.748868959s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-435000 --driver=hyperkit 
E1218 14:51:09.643330    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-435000 --driver=hyperkit : (36.736073856s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-433000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-435000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-435000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-435000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-435000: (5.266286157s)
helpers_test.go:175: Cleaning up "first-433000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-433000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-433000: (5.309394457s)
--- PASS: TestMinikubeProfile (86.90s)

TestMountStart/serial/StartWithMountFirst (16.14s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-748000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E1218 14:51:50.604293    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-748000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.138928891s)
--- PASS: TestMountStart/serial/StartWithMountFirst (16.14s)

TestMountStart/serial/VerifyMountFirst (0.32s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-748000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-748000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.32s)
=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-768000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-768000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (17.160561949s)
--- PASS: TestMountStart/serial/StartWithMountSecond (18.16s)
=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.33s)
=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-748000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-748000 --alsologtostderr -v=5: (2.37349009s)
--- PASS: TestMountStart/serial/DeleteFirst (2.37s)
=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.32s)
=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-768000
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-768000: (2.234092561s)
--- PASS: TestMountStart/serial/Stop (2.23s)
=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-768000
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-768000: (15.713709655s)
--- PASS: TestMountStart/serial/RestartStopped (16.71s)
=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-768000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.31s)
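The VerifyMount* steps above all reduce to one check: list the guest's mount table over SSH and confirm a 9p entry is present. A minimal sketch of that filter, run against sample `mount` output instead of a live hyperkit VM (the sample lines are illustrative, not captured from this run):

```shell
#!/bin/sh
# Illustrative sample of what `minikube -p <profile> ssh -- mount` might print
# when the 9p host mount is active (not captured from this run).
mount_output='192.169.0.1 on /minikube-host type 9p (rw,relatime,sync,dirsync,msize=6543)
/dev/vda1 on /data type ext4 (rw,relatime)'

# Mirrors the test's `... ssh -- mount | grep 9p`: a non-empty result means
# the host directory is mounted into the guest via 9p.
nine_p_line=$(printf '%s\n' "$mount_output" | grep 9p)
echo "$nine_p_line"
```

The same filter backs VerifyMountFirst, VerifyMountSecond, VerifyMountPostDelete, and VerifyMountPostStop; only the profile name changes.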
=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:86: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-880000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E1218 14:53:12.525671    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:53:39.010316    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.016424    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.027650    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.049799    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.090117    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.171730    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.333197    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:39.653978    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:40.294197    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:41.574739    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:44.135759    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:49.255985    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:53:59.496162    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:54:19.976584    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 14:54:57.244308    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 14:55:00.938891    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
multinode_test.go:86: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-880000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (2m34.507648745s)
multinode_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (154.75s)
=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:509: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:514: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- rollout status deployment/busybox
multinode_test.go:514: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-880000 -- rollout status deployment/busybox: (3.14643526s)
multinode_test.go:521: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:544: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:552: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-q5k4v -- nslookup kubernetes.io
multinode_test.go:552: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-zbjpt -- nslookup kubernetes.io
multinode_test.go:562: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-q5k4v -- nslookup kubernetes.default
multinode_test.go:562: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-zbjpt -- nslookup kubernetes.default
multinode_test.go:570: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-q5k4v -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:570: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-zbjpt -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.92s)
=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:580: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:588: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-q5k4v -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:599: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-q5k4v -- sh -c "ping -c 1 192.169.0.1"
multinode_test.go:588: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-zbjpt -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:599: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-880000 -- exec busybox-5bc68d56bd-zbjpt -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.91s)
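The ping test above first extracts the host IP from busybox's `nslookup` output with `awk 'NR==5' | cut -d' ' -f3`: busybox prints the answer for the queried name on line 5 as `Address 1: <ip> <name>`, so the third space-delimited field of that line is the IP. A sketch of that pipeline against sample busybox-style output (illustrative, not captured from this run):

```shell
#!/bin/sh
# Illustrative busybox nslookup output: line 5 carries the answer record
# for host.minikube.internal.
nslookup_output='Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.169.0.1 host.minikube.internal'

# Mirrors the test: select line 5, then take the third space-delimited field.
host_ip=$(printf '%s\n' "$nslookup_output" | awk 'NR==5' | cut -d' ' -f3)
echo "$host_ip"
# → 192.169.0.1
```

The extracted address is then the target of `ping -c 1` from each pod, as the log shows for 192.169.0.1.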
=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:111: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-880000 -v 3 --alsologtostderr
E1218 14:55:28.676723    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 14:55:56.367800    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
multinode_test.go:111: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-880000 -v 3 --alsologtostderr: (31.839953692s)
multinode_test.go:117: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (32.17s)
=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:211: (dbg) Run:  kubectl --context multinode-880000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.05s)
=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.20s)
=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp testdata/cp-test.txt multinode-880000:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile382742653/001/cp-test_multinode-880000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000:/home/docker/cp-test.txt multinode-880000-m02:/home/docker/cp-test_multinode-880000_multinode-880000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test_multinode-880000_multinode-880000-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000:/home/docker/cp-test.txt multinode-880000-m03:/home/docker/cp-test_multinode-880000_multinode-880000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test_multinode-880000_multinode-880000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp testdata/cp-test.txt multinode-880000-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m02:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile382742653/001/cp-test_multinode-880000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m02:/home/docker/cp-test.txt multinode-880000:/home/docker/cp-test_multinode-880000-m02_multinode-880000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test_multinode-880000-m02_multinode-880000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m02:/home/docker/cp-test.txt multinode-880000-m03:/home/docker/cp-test_multinode-880000-m02_multinode-880000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test_multinode-880000-m02_multinode-880000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp testdata/cp-test.txt multinode-880000-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m03:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile382742653/001/cp-test_multinode-880000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m03:/home/docker/cp-test.txt multinode-880000:/home/docker/cp-test_multinode-880000-m03_multinode-880000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000 "sudo cat /home/docker/cp-test_multinode-880000-m03_multinode-880000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 cp multinode-880000-m03:/home/docker/cp-test.txt multinode-880000-m02:/home/docker/cp-test_multinode-880000-m03_multinode-880000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 ssh -n multinode-880000-m02 "sudo cat /home/docker/cp-test_multinode-880000-m03_multinode-880000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.45s)
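Every CopyFile step above pairs a `minikube cp` with an `ssh -n <node> "sudo cat ..."` readback, and the helper passes when the bytes round-trip unchanged. A local stand-in for that round trip, using plain files in place of a live cluster (file names are hypothetical):

```shell
#!/bin/sh
# Local stand-in for the CopyFile round trip in helpers_test.go: push a file,
# read it back, and compare byte-for-byte.
workdir=$(mktemp -d)
printf 'Test file for minikube cp\n' > "$workdir/cp-test.txt"

# Stand-in for `minikube -p <profile> cp testdata/cp-test.txt <node>:/home/docker/cp-test.txt`
cp "$workdir/cp-test.txt" "$workdir/cp-test-readback.txt"

# Stand-in for the `ssh -n <node> "sudo cat ..."` readback-and-compare step.
if cmp -s "$workdir/cp-test.txt" "$workdir/cp-test-readback.txt"; then
  result="round-trip OK"
else
  result="round-trip FAILED"
fi
echo "$result"
```

The real test repeats this for every source/destination pair across the control plane, m02, and m03, which is why the log shows the same `sudo cat` after each `cp`.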
=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:238: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 node stop m03
multinode_test.go:238: (dbg) Done: out/minikube-darwin-amd64 -p multinode-880000 node stop m03: (2.191620993s)
multinode_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status
multinode_test.go:244: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-880000 status: exit status 7 (253.708431ms)

-- stdout --
	multinode-880000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-880000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-880000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:251: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
multinode_test.go:251: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr: exit status 7 (257.638718ms)

-- stdout --
	multinode-880000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-880000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-880000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1218 14:56:06.873785    3593 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:56:06.874084    3593 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:56:06.874091    3593 out.go:309] Setting ErrFile to fd 2...
	I1218 14:56:06.874096    3593 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:56:06.874272    3593 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 14:56:06.874457    3593 out.go:303] Setting JSON to false
	I1218 14:56:06.874480    3593 mustload.go:65] Loading cluster: multinode-880000
	I1218 14:56:06.874513    3593 notify.go:220] Checking for updates...
	I1218 14:56:06.874827    3593 config.go:182] Loaded profile config "multinode-880000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 14:56:06.874840    3593 status.go:255] checking status of multinode-880000 ...
	I1218 14:56:06.875199    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:06.875265    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:06.883782    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51573
	I1218 14:56:06.884270    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:06.884703    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:06.884734    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:06.884949    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:06.885051    3593 main.go:141] libmachine: (multinode-880000) Calling .GetState
	I1218 14:56:06.885140    3593 main.go:141] libmachine: (multinode-880000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 14:56:06.885198    3593 main.go:141] libmachine: (multinode-880000) DBG | hyperkit pid from json: 3278
	I1218 14:56:06.886368    3593 status.go:330] multinode-880000 host status = "Running" (err=<nil>)
	I1218 14:56:06.886387    3593 host.go:66] Checking if "multinode-880000" exists ...
	I1218 14:56:06.886644    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:06.886661    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:06.894383    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51575
	I1218 14:56:06.894719    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:06.895070    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:06.895088    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:06.895271    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:06.895365    3593 main.go:141] libmachine: (multinode-880000) Calling .GetIP
	I1218 14:56:06.895448    3593 host.go:66] Checking if "multinode-880000" exists ...
	I1218 14:56:06.895679    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:06.895708    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:06.906017    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51577
	I1218 14:56:06.906432    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:06.906851    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:06.906870    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:06.907106    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:06.907227    3593 main.go:141] libmachine: (multinode-880000) Calling .DriverName
	I1218 14:56:06.907418    3593 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 14:56:06.907442    3593 main.go:141] libmachine: (multinode-880000) Calling .GetSSHHostname
	I1218 14:56:06.907550    3593 main.go:141] libmachine: (multinode-880000) Calling .GetSSHPort
	I1218 14:56:06.907661    3593 main.go:141] libmachine: (multinode-880000) Calling .GetSSHKeyPath
	I1218 14:56:06.907758    3593 main.go:141] libmachine: (multinode-880000) Calling .GetSSHUsername
	I1218 14:56:06.907892    3593 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/multinode-880000/id_rsa Username:docker}
	I1218 14:56:06.942563    3593 ssh_runner.go:195] Run: systemctl --version
	I1218 14:56:06.946594    3593 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 14:56:06.957102    3593 kubeconfig.go:92] found "multinode-880000" server: "https://192.169.0.13:8443"
	I1218 14:56:06.957122    3593 api_server.go:166] Checking apiserver status ...
	I1218 14:56:06.957164    3593 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 14:56:06.966679    3593 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1899/cgroup
	I1218 14:56:06.972725    3593 api_server.go:182] apiserver freezer: "3:freezer:/kubepods/burstable/podb0f2896a29d2d4b462408767d0e96f45/029cd8c167ec32ef20a271aca76c51c6a682c65ca338b59ea53fe16250b110c0"
	I1218 14:56:06.972788    3593 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/podb0f2896a29d2d4b462408767d0e96f45/029cd8c167ec32ef20a271aca76c51c6a682c65ca338b59ea53fe16250b110c0/freezer.state
	I1218 14:56:06.979449    3593 api_server.go:204] freezer state: "THAWED"
	I1218 14:56:06.979464    3593 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I1218 14:56:06.982739    3593 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I1218 14:56:06.982750    3593 status.go:421] multinode-880000 apiserver status = Running (err=<nil>)
	I1218 14:56:06.982760    3593 status.go:257] multinode-880000 status: &{Name:multinode-880000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 14:56:06.982769    3593 status.go:255] checking status of multinode-880000-m02 ...
	I1218 14:56:06.983002    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:06.983022    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:06.991179    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51581
	I1218 14:56:06.991558    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:06.991905    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:06.991921    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:06.992151    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:06.992266    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetState
	I1218 14:56:06.992349    3593 main.go:141] libmachine: (multinode-880000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 14:56:06.992427    3593 main.go:141] libmachine: (multinode-880000-m02) DBG | hyperkit pid from json: 3303
	I1218 14:56:06.993601    3593 status.go:330] multinode-880000-m02 host status = "Running" (err=<nil>)
	I1218 14:56:06.993609    3593 host.go:66] Checking if "multinode-880000-m02" exists ...
	I1218 14:56:06.993868    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:06.993894    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:07.001808    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51583
	I1218 14:56:07.002152    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:07.002477    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:07.002490    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:07.002698    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:07.002783    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetIP
	I1218 14:56:07.002877    3593 host.go:66] Checking if "multinode-880000-m02" exists ...
	I1218 14:56:07.003150    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:07.003177    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:07.011047    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51585
	I1218 14:56:07.011383    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:07.011700    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:07.011728    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:07.011939    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:07.012054    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .DriverName
	I1218 14:56:07.012185    3593 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 14:56:07.012196    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetSSHHostname
	I1218 14:56:07.012263    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetSSHPort
	I1218 14:56:07.012340    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetSSHKeyPath
	I1218 14:56:07.012429    3593 main.go:141] libmachine: (multinode-880000-m02) Calling .GetSSHUsername
	I1218 14:56:07.012508    3593 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17822-999/.minikube/machines/multinode-880000-m02/id_rsa Username:docker}
	I1218 14:56:07.056599    3593 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 14:56:07.065130    3593 status.go:257] multinode-880000-m02 status: &{Name:multinode-880000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1218 14:56:07.065145    3593 status.go:255] checking status of multinode-880000-m03 ...
	I1218 14:56:07.065402    3593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:56:07.065425    3593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:56:07.073544    3593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51588
	I1218 14:56:07.073900    3593 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:56:07.074271    3593 main.go:141] libmachine: Using API Version  1
	I1218 14:56:07.074289    3593 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:56:07.074520    3593 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:56:07.074628    3593 main.go:141] libmachine: (multinode-880000-m03) Calling .GetState
	I1218 14:56:07.074704    3593 main.go:141] libmachine: (multinode-880000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 14:56:07.074770    3593 main.go:141] libmachine: (multinode-880000-m03) DBG | hyperkit pid from json: 3390
	I1218 14:56:07.075919    3593 main.go:141] libmachine: (multinode-880000-m03) DBG | hyperkit pid 3390 missing from process table
	I1218 14:56:07.075952    3593 status.go:330] multinode-880000-m03 host status = "Stopped" (err=<nil>)
	I1218 14:56:07.075959    3593 status.go:343] host is not running, skipping remaining checks
	I1218 14:56:07.075965    3593 status.go:257] multinode-880000-m03 status: &{Name:multinode-880000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.70s)

TestMultiNode/serial/StartAfterStop (27.15s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 node start m03 --alsologtostderr
E1218 14:56:22.860375    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
multinode_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p multinode-880000 node start m03 --alsologtostderr: (26.7828428s)
multinode_test.go:289: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status
multinode_test.go:303: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (27.15s)

TestMultiNode/serial/RestartKeepsNodes (142.39s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-880000
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-880000
multinode_test.go:318: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-880000: (18.386468625s)
multinode_test.go:323: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-880000 --wait=true -v=8 --alsologtostderr
E1218 14:58:39.011239    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
multinode_test.go:323: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-880000 --wait=true -v=8 --alsologtostderr: (2m3.88729856s)
multinode_test.go:328: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-880000
--- PASS: TestMultiNode/serial/RestartKeepsNodes (142.39s)

TestMultiNode/serial/DeleteNode (2.90s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:422: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 node delete m03
multinode_test.go:422: (dbg) Done: out/minikube-darwin-amd64 -p multinode-880000 node delete m03: (2.561576482s)
multinode_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
multinode_test.go:452: (dbg) Run:  kubectl get nodes
multinode_test.go:460: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (2.90s)

TestMultiNode/serial/StopMultiNode (16.47s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:342: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 stop
E1218 14:59:06.701467    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
multinode_test.go:342: (dbg) Done: out/minikube-darwin-amd64 -p multinode-880000 stop: (16.316985577s)
multinode_test.go:348: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status
multinode_test.go:348: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-880000 status: exit status 7 (77.189972ms)

-- stdout --
	multinode-880000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-880000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:355: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
multinode_test.go:355: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr: exit status 7 (77.971844ms)

-- stdout --
	multinode-880000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-880000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1218 14:59:15.961874    3725 out.go:296] Setting OutFile to fd 1 ...
	I1218 14:59:15.962091    3725 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:59:15.962098    3725 out.go:309] Setting ErrFile to fd 2...
	I1218 14:59:15.962102    3725 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1218 14:59:15.962288    3725 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17822-999/.minikube/bin
	I1218 14:59:15.962493    3725 out.go:303] Setting JSON to false
	I1218 14:59:15.962513    3725 mustload.go:65] Loading cluster: multinode-880000
	I1218 14:59:15.962554    3725 notify.go:220] Checking for updates...
	I1218 14:59:15.962873    3725 config.go:182] Loaded profile config "multinode-880000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.4
	I1218 14:59:15.962884    3725 status.go:255] checking status of multinode-880000 ...
	I1218 14:59:15.963271    3725 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:59:15.963334    3725 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:59:15.971425    3725 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51769
	I1218 14:59:15.971755    3725 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:59:15.972180    3725 main.go:141] libmachine: Using API Version  1
	I1218 14:59:15.972189    3725 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:59:15.972385    3725 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:59:15.972491    3725 main.go:141] libmachine: (multinode-880000) Calling .GetState
	I1218 14:59:15.972606    3725 main.go:141] libmachine: (multinode-880000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 14:59:15.972643    3725 main.go:141] libmachine: (multinode-880000) DBG | hyperkit pid from json: 3651
	I1218 14:59:15.973536    3725 main.go:141] libmachine: (multinode-880000) DBG | hyperkit pid 3651 missing from process table
	I1218 14:59:15.973558    3725 status.go:330] multinode-880000 host status = "Stopped" (err=<nil>)
	I1218 14:59:15.973564    3725 status.go:343] host is not running, skipping remaining checks
	I1218 14:59:15.973569    3725 status.go:257] multinode-880000 status: &{Name:multinode-880000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 14:59:15.973586    3725 status.go:255] checking status of multinode-880000-m02 ...
	I1218 14:59:15.973816    3725 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1218 14:59:15.973838    3725 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I1218 14:59:15.981827    3725 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51771
	I1218 14:59:15.982160    3725 main.go:141] libmachine: () Calling .GetVersion
	I1218 14:59:15.982487    3725 main.go:141] libmachine: Using API Version  1
	I1218 14:59:15.982499    3725 main.go:141] libmachine: () Calling .SetConfigRaw
	I1218 14:59:15.982707    3725 main.go:141] libmachine: () Calling .GetMachineName
	I1218 14:59:15.982816    3725 main.go:141] libmachine: (multinode-880000-m02) Calling .GetState
	I1218 14:59:15.982899    3725 main.go:141] libmachine: (multinode-880000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1218 14:59:15.982960    3725 main.go:141] libmachine: (multinode-880000-m02) DBG | hyperkit pid from json: 3667
	I1218 14:59:15.983863    3725 main.go:141] libmachine: (multinode-880000-m02) DBG | hyperkit pid 3667 missing from process table
	I1218 14:59:15.983893    3725 status.go:330] multinode-880000-m02 host status = "Stopped" (err=<nil>)
	I1218 14:59:15.983899    3725 status.go:343] host is not running, skipping remaining checks
	I1218 14:59:15.983904    3725 status.go:257] multinode-880000-m02 status: &{Name:multinode-880000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (16.47s)

TestMultiNode/serial/RestartMultiNode (93.85s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-880000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E1218 14:59:57.246018    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:00:28.678137    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
multinode_test.go:382: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-880000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (1m33.518156775s)
multinode_test.go:388: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-880000 status --alsologtostderr
multinode_test.go:402: (dbg) Run:  kubectl get nodes
multinode_test.go:410: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (93.85s)

TestMultiNode/serial/ValidateNameConflict (47.58s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:471: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-880000
multinode_test.go:480: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-880000-m02 --driver=hyperkit 
multinode_test.go:480: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-880000-m02 --driver=hyperkit : exit status 14 (417.849761ms)

-- stdout --
	* [multinode-880000-m02] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-880000-m02' is duplicated with machine name 'multinode-880000-m02' in profile 'multinode-880000'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:488: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-880000-m03 --driver=hyperkit 
E1218 15:01:20.296199    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
multinode_test.go:488: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-880000-m03 --driver=hyperkit : (38.359700086s)
multinode_test.go:495: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-880000
multinode_test.go:495: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-880000: exit status 80 (267.436661ms)

-- stdout --
	* Adding node m03 to cluster multinode-880000
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-880000-m03 already exists in multinode-880000-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:500: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-880000-m03
multinode_test.go:500: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-880000-m03: (8.482477483s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (47.58s)
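The exit status 14 above follows from how minikube derives machine names for a multi-node profile: the primary node keeps the profile name and secondaries get `<profile>-mNN` suffixes, so a new profile named `multinode-880000-m02` collides with an existing machine. A simplified sketch of that collision check (`machineNames` and `isDuplicate` are illustrative helpers, not minikube's actual implementation):

```go
package main

import "fmt"

// machineNames lists the machine names a multi-node profile owns:
// the profile name itself plus <profile>-m02, <profile>-m03, ...
func machineNames(profile string, nodes int) []string {
	names := []string{profile}
	for i := 2; i <= nodes; i++ {
		names = append(names, fmt.Sprintf("%s-m%02d", profile, i))
	}
	return names
}

// isDuplicate reports whether a candidate profile name collides with
// any machine name already owned by an existing profile.
func isDuplicate(candidate, profile string, nodes int) bool {
	for _, n := range machineNames(profile, nodes) {
		if n == candidate {
			return true
		}
	}
	return false
}

func main() {
	// multinode-880000 ran with three nodes in this test, so
	// "multinode-880000-m02" is taken while an unrelated name is free.
	fmt.Println(isDuplicate("multinode-880000-m02", "multinode-880000", 3)) // true
	fmt.Println(isDuplicate("multinode-880000-xyz", "multinode-880000", 3)) // false
}
```

This is also why the follow-up `start -p multinode-880000-m03` succeeds here: the m03 machine had been deleted earlier in the serial run, leaving that name unclaimed.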

TestPreload (170.35s)

=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-386000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-386000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m15.958149794s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-386000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-386000 image pull gcr.io/k8s-minikube/busybox: (1.165339119s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-386000
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-386000: (8.241815237s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-386000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
E1218 15:03:39.012343    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-386000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (1m19.559754803s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-386000 image list
helpers_test.go:175: Cleaning up "test-preload-386000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-386000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-386000: (5.269748213s)
--- PASS: TestPreload (170.35s)

TestScheduledStopUnix (103.37s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-119000 --memory=2048 --driver=hyperkit 
E1218 15:04:57.245881    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-119000 --memory=2048 --driver=hyperkit : (31.855743527s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-119000 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-119000 -n scheduled-stop-119000
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-119000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-119000 --cancel-scheduled
E1218 15:05:28.678584    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-119000 -n scheduled-stop-119000
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-119000
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-119000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-119000
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-119000: exit status 7 (73.50601ms)

-- stdout --
	scheduled-stop-119000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-119000 -n scheduled-stop-119000
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-119000 -n scheduled-stop-119000: exit status 7 (67.307648ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-119000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-119000
--- PASS: TestScheduledStopUnix (103.37s)

TestSkaffold (112.49s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe2384680577 version
skaffold_test.go:63: skaffold version: v2.9.0
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-080000 --memory=2600 --driver=hyperkit 
E1218 15:06:51.730098    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-080000 --memory=2600 --driver=hyperkit : (35.670223326s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe2384680577 run --minikube-profile skaffold-080000 --kube-context skaffold-080000 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:105: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe2384680577 run --minikube-profile skaffold-080000 --kube-context skaffold-080000 --status-check=true --port-forward=false --interactive=false: (58.282370552s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:344: "leeroy-app-6dfd6bcb4f-s6gpn" [baf635ef-893c-4c05-a431-eabe0476db55] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 6.002785161s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:344: "leeroy-web-98ddb7976-4wg7z" [6c508dbd-2331-49d0-9311-dba5182efeec] Running
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.004893825s
helpers_test.go:175: Cleaning up "skaffold-080000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-080000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-080000: (5.26258059s)
--- PASS: TestSkaffold (112.49s)

TestRunningBinaryUpgrade (230.81s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:133: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.722758097.exe start -p running-upgrade-946000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:133: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.722758097.exe start -p running-upgrade-946000 --memory=2200 --vm-driver=hyperkit : (1m29.260445102s)
version_upgrade_test.go:143: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-946000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E1218 15:12:57.756777    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:57.762604    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:57.773109    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:57.795334    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:57.835620    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:57.915949    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:58.076671    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:58.397311    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:12:59.037535    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:13:00.318020    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:13:02.879200    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:13:07.999357    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:13:18.240948    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
version_upgrade_test.go:143: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-946000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (2m15.371483867s)
helpers_test.go:175: Cleaning up "running-upgrade-946000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-946000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-946000: (5.300383573s)
--- PASS: TestRunningBinaryUpgrade (230.81s)

TestKubernetesUpgrade (189.67s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:235: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
E1218 15:14:19.684185    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
version_upgrade_test.go:235: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m13.936687559s)
version_upgrade_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-780000
version_upgrade_test.go:240: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-780000: (2.236553999s)
version_upgrade_test.go:245: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-780000 status --format={{.Host}}
version_upgrade_test.go:245: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-780000 status --format={{.Host}}: exit status 7 (69.059439ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:247: status error: exit status 7 (may be ok)
version_upgrade_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit 
E1218 15:15:28.680530    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:15:41.605367    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
version_upgrade_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit : (1m7.362339647s)
version_upgrade_test.go:261: (dbg) Run:  kubectl --context kubernetes-upgrade-780000 version --output=json
version_upgrade_test.go:280: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:282: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (501.137903ms)

-- stdout --
	* [kubernetes-upgrade-780000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.29.0-rc.2 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-780000
	    minikube start -p kubernetes-upgrade-780000 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-7800002 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.29.0-rc.2, by running:
	    
	    minikube start -p kubernetes-upgrade-780000 --kubernetes-version=v1.29.0-rc.2
	    

** /stderr **
version_upgrade_test.go:286: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:288: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-780000 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=hyperkit : (41.981604149s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-780000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-780000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-780000: (3.531741993s)
--- PASS: TestKubernetesUpgrade (189.67s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.98s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.32.0 on darwin
- MINIKUBE_LOCATION=17822
- KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2360697385/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2360697385/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2360697385/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2360697385/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.98s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.84s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.32.0 on darwin
- MINIKUBE_LOCATION=17822
- KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3997309873/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3997309873/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3997309873/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3997309873/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.84s)

TestStoppedBinaryUpgrade/Setup (0.92s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.92s)

TestStoppedBinaryUpgrade/Upgrade (152.38s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:196: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.2083875391.exe start -p stopped-upgrade-534000 --memory=2200 --vm-driver=hyperkit 
E1218 15:14:57.248711    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
version_upgrade_test.go:196: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.2083875391.exe start -p stopped-upgrade-534000 --memory=2200 --vm-driver=hyperkit : (1m24.395608576s)
version_upgrade_test.go:205: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.2083875391.exe -p stopped-upgrade-534000 stop
version_upgrade_test.go:205: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.2083875391.exe -p stopped-upgrade-534000 stop: (8.072381342s)
version_upgrade_test.go:211: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-534000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:211: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-534000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (59.912518311s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (152.38s)

TestPause/serial/Start (48.92s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-356000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-356000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (48.916363692s)
--- PASS: TestPause/serial/Start (48.92s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.43s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:219: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-534000
version_upgrade_test.go:219: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-534000: (2.433611197s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.43s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.44s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (443.790638ms)

-- stdout --
	* [NoKubernetes-601000] minikube v1.32.0 on Darwin 14.2
	  - MINIKUBE_LOCATION=17822
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17822-999/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17822-999/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.44s)

TestNoKubernetes/serial/StartWithK8s (40.86s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-601000 --driver=hyperkit 
E1218 15:17:57.756310    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 15:18:00.299980    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-601000 --driver=hyperkit : (40.692065435s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-601000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (40.86s)

TestPause/serial/SecondStartNoReconfiguration (31.42s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-356000 --alsologtostderr -v=1 --driver=hyperkit 
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-356000 --alsologtostderr -v=1 --driver=hyperkit : (31.405077469s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (31.42s)

TestNoKubernetes/serial/StartWithStopK8s (16.28s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --driver=hyperkit : (13.686064458s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-601000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-601000 status -o json: exit status 2 (142.789694ms)

-- stdout --
	{"Name":"NoKubernetes-601000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-601000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-601000: (2.448998152s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (16.28s)

TestNoKubernetes/serial/Start (16.14s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --driver=hyperkit 
E1218 15:18:25.447572    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-601000 --no-kubernetes --driver=hyperkit : (16.135735169s)
--- PASS: TestNoKubernetes/serial/Start (16.14s)

TestPause/serial/Pause (0.52s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-356000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.52s)

TestPause/serial/VerifyStatus (0.16s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 status -p pause-356000 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p pause-356000 --output=json --layout=cluster: exit status 2 (157.949817ms)

-- stdout --
	{"Name":"pause-356000","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 12 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.32.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-356000","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.16s)

TestPause/serial/Unpause (0.52s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 unpause -p pause-356000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.52s)

TestPause/serial/PauseAgain (0.58s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-356000 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.58s)

TestPause/serial/DeletePaused (5.24s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p pause-356000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p pause-356000 --alsologtostderr -v=5: (5.242518308s)
--- PASS: TestPause/serial/DeletePaused (5.24s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.14s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-601000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-601000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (135.497764ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.14s)

TestNoKubernetes/serial/ProfileList (30.89s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
E1218 15:18:39.014331    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
no_kubernetes_test.go:169: (dbg) Done: out/minikube-darwin-amd64 profile list: (3.943308649s)
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
no_kubernetes_test.go:179: (dbg) Done: out/minikube-darwin-amd64 profile list --output=json: (26.941954961s)
--- PASS: TestNoKubernetes/serial/ProfileList (30.89s)

TestPause/serial/VerifyDeletedResources (0.26s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.26s)

TestNetworkPlugins/group/auto/Start (90.26s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p auto-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit : (1m30.255667177s)
--- PASS: TestNetworkPlugins/group/auto/Start (90.26s)

TestNoKubernetes/serial/Stop (2.24s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-601000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-601000: (2.23549286s)
--- PASS: TestNoKubernetes/serial/Stop (2.24s)

TestNoKubernetes/serial/StartNoArgs (17.33s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-601000 --driver=hyperkit 
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-601000 --driver=hyperkit : (17.331945449s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (17.33s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.14s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-601000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-601000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (136.689018ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.14s)

TestNetworkPlugins/group/kindnet/Start (58.72s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit 
E1218 15:19:57.250171    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit : (58.719854434s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (58.72s)

TestNetworkPlugins/group/auto/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.16s)

TestNetworkPlugins/group/auto/NetCatPod (13.32s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-b27qt" [6c11acd4-3b8b-46d8-9070-df6d13d3065a] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-b27qt" [6c11acd4-3b8b-46d8-9070-df6d13d3065a] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 13.004345313s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (13.32s)

TestNetworkPlugins/group/auto/DNS (0.13s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.13s)

TestNetworkPlugins/group/auto/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.11s)

TestNetworkPlugins/group/auto/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.11s)

TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-89h6s" [25f97cfd-89d5-4796-a89e-347b81fd651e] Running
E1218 15:20:28.681942    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.004470576s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)

TestNetworkPlugins/group/kindnet/NetCatPod (12.21s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-w4jbz" [22fdac91-46c8-4d3a-b546-88a04d84655e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-w4jbz" [22fdac91-46c8-4d3a-b546-88a04d84655e] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 12.002540552s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (12.21s)

TestNetworkPlugins/group/calico/Start (80.41s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p calico-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit : (1m20.410387678s)
--- PASS: TestNetworkPlugins/group/calico/Start (80.41s)

TestNetworkPlugins/group/kindnet/DNS (0.13s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.13s)

TestNetworkPlugins/group/kindnet/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.10s)

TestNetworkPlugins/group/kindnet/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.11s)

TestNetworkPlugins/group/custom-flannel/Start (59.1s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (59.102202608s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (59.10s)

TestNetworkPlugins/group/calico/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-fwv26" [7f6e2c31-2715-4540-be8a-f17f81c0d7f1] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.005344751s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.01s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.15s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (11.17s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-72lgs" [2b9ad268-31d4-4592-973e-b7fa6d270c0b] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-72lgs" [2b9ad268-31d4-4592-973e-b7fa6d270c0b] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 11.002343909s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (11.17s)

TestNetworkPlugins/group/calico/KubeletFlags (0.17s)
=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.17s)

TestNetworkPlugins/group/calico/NetCatPod (10.22s)
=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-rmc72" [7bc64ccb-5213-4f4b-8e98-057dc7c4aadc] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-rmc72" [7bc64ccb-5213-4f4b-8e98-057dc7c4aadc] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 10.004710926s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (10.22s)

TestNetworkPlugins/group/custom-flannel/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)
=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.10s)

TestNetworkPlugins/group/calico/DNS (0.12s)
=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.12s)

TestNetworkPlugins/group/calico/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.10s)

TestNetworkPlugins/group/calico/HairPin (0.11s)
=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.11s)

TestNetworkPlugins/group/false/Start (50.6s)
=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p false-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p false-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit : (50.596511455s)
--- PASS: TestNetworkPlugins/group/false/Start (50.60s)

TestNetworkPlugins/group/enable-default-cni/Start (58.12s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit 
E1218 15:22:57.864264    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit : (58.122590582s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (58.12s)

TestNetworkPlugins/group/false/KubeletFlags (0.16s)
=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.16s)

TestNetworkPlugins/group/false/NetCatPod (12.22s)
=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context false-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-5vwq7" [a3d18454-e8ab-421c-900a-440254b0b648] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-5vwq7" [a3d18454-e8ab-421c-900a-440254b0b648] Running
E1218 15:23:31.840614    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 12.004174475s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (12.22s)

TestNetworkPlugins/group/false/DNS (0.15s)
=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:175: (dbg) Run:  kubectl --context false-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.15s)

TestNetworkPlugins/group/false/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:194: (dbg) Run:  kubectl --context false-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.10s)

TestNetworkPlugins/group/false/HairPin (0.11s)
=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:264: (dbg) Run:  kubectl --context false-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/false/HairPin (0.11s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.17s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.17s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (12.18s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-9ztxc" [2388c70b-c2f2-45b7-9b1c-a787d7b8d413] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E1218 15:23:39.122322    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-9ztxc" [2388c70b-c2f2-45b7-9b1c-a787d7b8d413] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 12.005201574s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (12.18s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.11s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.11s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.10s)

TestNetworkPlugins/group/flannel/Start (59.03s)
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit : (59.034015987s)
--- PASS: TestNetworkPlugins/group/flannel/Start (59.03s)

TestNetworkPlugins/group/bridge/Start (49.31s)
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit : (49.308760334s)
--- PASS: TestNetworkPlugins/group/bridge/Start (49.31s)

TestNetworkPlugins/group/flannel/ControllerPod (6s)
=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:344: "kube-flannel-ds-xgg56" [fa405ff9-6644-4dc5-858d-cc59bfe67c2b] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.002360838s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.00s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)
=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)

TestNetworkPlugins/group/bridge/NetCatPod (10.18s)
=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-drg2x" [b0f7875d-3f24-4638-9b0a-59e0708b04a7] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E1218 15:24:57.357253    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-drg2x" [b0f7875d-3f24-4638-9b0a-59e0708b04a7] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 10.004680241s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (10.18s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.17s)
=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.17s)

TestNetworkPlugins/group/flannel/NetCatPod (11.18s)
=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-bzl9l" [d71b81e8-51a7-42d5-96ca-6793ecc5085d] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-bzl9l" [d71b81e8-51a7-42d5-96ca-6793ecc5085d] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 11.003507724s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (11.18s)

TestNetworkPlugins/group/bridge/DNS (0.12s)
=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.12s)

TestNetworkPlugins/group/bridge/Localhost (0.11s)
=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.11s)

TestNetworkPlugins/group/bridge/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.10s)

TestNetworkPlugins/group/flannel/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-302000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.13s)

TestNetworkPlugins/group/flannel/Localhost (0.11s)
=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.11s)

TestNetworkPlugins/group/flannel/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.10s)

TestNetworkPlugins/group/kubenet/Start (88.13s)
=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit 
E1218 15:25:26.991303    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:26.997109    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:27.007178    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:27.028646    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:27.069318    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:27.150583    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-302000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit : (1m28.12786519s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (88.13s)

TestStartStop/group/old-k8s-version/serial/FirstStart (160.11s)
=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-920000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E1218 15:25:27.312880    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:27.633155    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:28.273452    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:28.791709    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:25:29.553777    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:31.845537    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:25:32.114745    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:37.235107    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:47.476576    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:25:52.326783    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:26:07.957110    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:26:33.287705    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:26:42.178159    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 15:26:48.918398    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-920000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (2m40.113845795s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (160.11s)

TestNetworkPlugins/group/kubenet/KubeletFlags (0.17s)
=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-302000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.17s)

TestNetworkPlugins/group/kubenet/NetCatPod (11.19s)
=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kubenet-302000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-q58pk" [a665decb-826b-48f2-b945-96fad22c5171] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-q58pk" [a665decb-826b-48f2-b945-96fad22c5171] Running
E1218 15:27:02.313259    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.319378    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.331096    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.351542    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.392785    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.474545    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:27:02.635990    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 11.00502683s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (11.19s)

TestNetworkPlugins/group/kubenet/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kubenet-302000 exec deployment/netcat -- nslookup kubernetes.default
E1218 15:27:02.833418    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:27:02.839270    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:27:02.849409    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.13s)

TestNetworkPlugins/group/kubenet/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kubenet-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
E1218 15:27:02.870612    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:27:02.911194    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:27:02.957170    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)
TestNetworkPlugins/group/kubenet/HairPin (0.11s)
=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kubenet-302000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1218 15:27:02.991621    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/kubenet/HairPin (0.11s)
E1218 16:08:22.862343    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 16:08:36.218146    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 16:08:39.243437    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 16:09:25.443821    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 16:09:51.566439    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 16:09:55.705378    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 16:09:57.477486    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 16:10:11.475786    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 16:10:27.111843    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 16:10:28.911147    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 16:11:10.755601    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 16:11:22.392383    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 16:11:51.839913    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 16:12:02.431260    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 16:12:02.953002    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 16:12:57.988947    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
E1218 16:13:07.705125    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/old-k8s-version-920000/client.crt: no such file or directory
E1218 16:13:22.864671    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 16:13:31.966532    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 16:13:36.220220    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 16:13:39.245307    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
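The long run of `cert_rotation.go:168` errors above appears to come from client-go's certificate reloader repeatedly re-opening `client.crt` files for minikube profiles that have since been deleted. When triaging a report like this, tallying the errors per profile helps separate this background noise from genuinely new failures. A minimal sketch; `tally_missing_certs` is hypothetical triage tooling, not part of the minikube test suite, and `log` holds a few lines lifted from the report above:

```python
import re
from collections import Counter

# A few representative lines from the report above.
log = """\
E1218 16:08:22.862343    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 16:09:25.443821    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 16:11:22.392383    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
"""

def tally_missing_certs(text: str) -> Counter:
    """Count cert_rotation 'no such file' errors per minikube profile."""
    pat = re.compile(r"profiles/([^/]+)/client\.crt: no such file")
    return Counter(pat.findall(text))

print(tally_missing_certs(log))
```

Feeding the full report through the same helper would show which deleted profiles dominate the noise.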
TestStartStop/group/old-k8s-version/serial/DeployApp (9.31s)
=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-920000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [355256d8-7631-4bf3-89ea-733b0d7f70cc] Pending
helpers_test.go:344: "busybox" [355256d8-7631-4bf3-89ea-733b0d7f70cc] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [355256d8-7631-4bf3-89ea-733b0d7f70cc] Running
E1218 15:28:10.840810    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 9.003443804s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-920000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (9.31s)
TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.73s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-920000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-920000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.73s)
TestStartStop/group/old-k8s-version/serial/Stop (8.29s)
=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-920000 --alsologtostderr -v=3
E1218 15:28:22.745569    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:22.751007    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:22.763203    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:22.785393    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:22.825586    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:22.905887    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:23.066255    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:23.386826    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:24.028216    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:24.244288    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:28:24.759902    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:28:25.309336    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-920000 --alsologtostderr -v=3: (8.287107345s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (8.29s)
TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.34s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-920000 -n old-k8s-version-920000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-920000 -n old-k8s-version-920000: exit status 7 (67.337229ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-920000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.34s)
TestStartStop/group/old-k8s-version/serial/SecondStart (497.48s)
=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-920000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E1218 15:28:27.869625    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:32.989913    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:36.100098    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.105258    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.116053    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.137633    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.178358    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.296406    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.457418    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:36.778632    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:37.419479    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:38.701610    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:39.126990    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/ingress-addon-legacy-428000/client.crt: no such file or directory
E1218 15:28:41.262156    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:43.230575    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:28:46.383549    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:28:56.623893    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:29:03.711853    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:29:17.105334    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:29:20.922528    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/skaffold-080000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-920000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (8m17.315188256s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-920000 -n old-k8s-version-920000
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (497.48s)
TestStartStop/group/no-preload/serial/Stop (2.24s)
=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-994000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-994000 --alsologtostderr -v=3: (2.239322344s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (2.24s)
TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.32s)
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000: exit status 7 (68.348521ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-994000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.32s)
TestStartStop/group/no-preload/serial/SecondStart (106.2s)
=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-994000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E1218 15:29:44.674340    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:29:46.167104    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:29:46.682091    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:29:51.451243    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.456353    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.467791    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.488551    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.529439    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.609905    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:51.771472    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:52.092912    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:52.733806    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:54.013953    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:55.589119    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.594922    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.605649    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.627267    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.667432    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.748521    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:55.908662    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:56.229114    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:56.575087    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:29:56.876000    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:29:57.361660    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 15:29:58.066371    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
E1218 15:29:58.157014    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:30:00.717674    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:30:01.695750    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:30:05.837954    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:30:11.359934    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:30:11.936035    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:30:16.078258    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:30:26.994961    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:30:28.795411    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 15:30:32.417104    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:30:36.560070    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:30:39.054805    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 15:30:54.684308    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 15:31:06.597188    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/false-302000/client.crt: no such file or directory
E1218 15:31:13.377910    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:31:17.520960    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 15:31:19.988377    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/enable-default-cni-302000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-994000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: (1m46.02980152s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-994000 -n no-preload-994000
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (106.20s)
TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (6.01s)
=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-m78nl" [6391c430-538e-44ab-b7b7-78a32743968a] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.003887556s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (6.01s)
TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.07s)
=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-m78nl" [6391c430-538e-44ab-b7b7-78a32743968a] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.0043062s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-994000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.07s)
TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)
=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p no-preload-994000 image list --format=json
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)
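The VerifyKubernetesImages step above inspects the JSON emitted by `minikube image list --format=json`. When checking a report like this by hand, the dump can be post-processed with a few lines of Python; a minimal sketch, where the `sample` payload and its field names are assumptions shaped like that output rather than data taken from this report, and `image_tags` is a hypothetical helper:

```python
import json

# Sample payload shaped like `minikube image list --format=json` output;
# the field names here are assumptions, not taken from this report.
sample = '''[
  {"id": "sha256:abc", "repoTags": ["registry.k8s.io/pause:3.9"], "size": "744000"},
  {"id": "sha256:def", "repoTags": ["registry.k8s.io/echoserver:1.4"], "size": "140000000"}
]'''

def image_tags(raw: str) -> list:
    """Flatten the repoTags arrays from an image-list JSON dump."""
    return [tag for img in json.loads(raw) for tag in img.get("repoTags", [])]

print(image_tags(sample))
```

The same flattening makes it easy to diff the images present in the VM against the set the test expects for a given Kubernetes version.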
TestStartStop/group/no-preload/serial/Pause (1.83s)
=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-994000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-994000 -n no-preload-994000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-994000 -n no-preload-994000: exit status 2 (156.502276ms)
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-994000 -n no-preload-994000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-994000 -n no-preload-994000: exit status 2 (156.476615ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-994000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-994000 -n no-preload-994000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-994000 -n no-preload-994000
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.83s)

TestStartStop/group/embed-certs/serial/FirstStart (50.26s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-732000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4
E1218 15:31:51.722821    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:51.728027    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:51.738133    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:51.758201    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:51.800193    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:51.881770    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:52.042990    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:52.363573    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:53.004209    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:54.284423    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:31:56.846043    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:32:01.966310    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:32:02.316430    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:32:02.839323    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
E1218 15:32:12.206617    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 15:32:30.060477    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 15:32:30.526084    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-732000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.4: (50.257402083s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (50.26s)

TestStartStop/group/embed-certs/serial/DeployApp (9.24s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-732000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [c2cf01bd-ba3b-4447-8463-8fea497c6d60] Pending
helpers_test.go:344: "busybox" [c2cf01bd-ba3b-4447-8463-8fea497c6d60] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E1218 15:32:32.688080    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
helpers_test.go:344: "busybox" [c2cf01bd-ba3b-4447-8463-8fea497c6d60] Running
E1218 15:32:35.300033    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 15:32:39.442870    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 9.003479591s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-732000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (9.24s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.9s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-732000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-732000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.90s)

TestStartStop/group/embed-certs/serial/Stop (8.28s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-732000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-732000 --alsologtostderr -v=3: (8.278317301s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (8.28s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-732000 -n embed-certs-732000: exit status 7 (68.357021ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-732000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.32s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-gd2kf" [6961ba5e-9dc0-490c-9ebc-2c63426f2f91] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.003405652s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.00s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-gd2kf" [6961ba5e-9dc0-490c-9ebc-2c63426f2f91] Running
E1218 15:36:51.727520    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.003971861s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-920000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p old-k8s-version-920000 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/old-k8s-version/serial/Pause (1.83s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-920000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-920000 -n old-k8s-version-920000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-920000 -n old-k8s-version-920000: exit status 2 (160.836853ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-920000 -n old-k8s-version-920000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-920000 -n old-k8s-version-920000: exit status 2 (155.675875ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-920000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-920000 -n old-k8s-version-920000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-920000 -n old-k8s-version-920000
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.83s)

TestStartStop/group/default-k8s-diff-port/serial/Stop (1.27s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-diff-port-748000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-diff-port-748000 --alsologtostderr -v=3: (1.274564286s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (1.27s)

TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-748000 -n default-k8s-diff-port-748000: exit status 7 (68.114637ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-diff-port-748000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.32s)

TestStartStop/group/newest-cni/serial/FirstStart (594.88s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-511000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
E1218 16:04:51.515471    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/flannel-302000/client.crt: no such file or directory
E1218 16:04:54.833039    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 16:04:55.653677    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/bridge-302000/client.crt: no such file or directory
E1218 16:04:57.426584    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/addons-080000/client.crt: no such file or directory
E1218 16:05:11.422856    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/auto-302000/client.crt: no such file or directory
E1218 16:05:27.058841    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kindnet-302000/client.crt: no such file or directory
E1218 16:05:28.857828    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/functional-821000/client.crt: no such file or directory
E1218 16:06:22.391946    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/no-preload-994000/client.crt: no such file or directory
E1218 16:06:51.837399    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/kubenet-302000/client.crt: no such file or directory
E1218 16:07:02.430546    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/calico-302000/client.crt: no such file or directory
E1218 16:07:02.950493    1483 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17822-999/.minikube/profiles/custom-flannel-302000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-511000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: (9m54.877460164s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (594.88s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.86s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-511000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.86s)

TestStartStop/group/newest-cni/serial/Stop (8.28s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-511000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-511000 --alsologtostderr -v=3: (8.280454729s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.28s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.32s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-511000 -n newest-cni-511000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-511000 -n newest-cni-511000: exit status 7 (68.231241ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-511000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.32s)

TestStartStop/group/newest-cni/serial/SecondStart (35.7s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-511000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-511000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.29.0-rc.2: (35.53796541s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-511000 -n newest-cni-511000
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (35.70s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p newest-cni-511000 image list --format=json
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/newest-cni/serial/Pause (1.74s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-511000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-511000 -n newest-cni-511000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-511000 -n newest-cni-511000: exit status 2 (163.607714ms)
                                                
-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-511000 -n newest-cni-511000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-511000 -n newest-cni-511000: exit status 2 (163.599071ms)
                                                
-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-511000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-511000 -n newest-cni-511000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-511000 -n newest-cni-511000
--- PASS: TestStartStop/group/newest-cni/serial/Pause (1.74s)


Test skip (21/318)

TestDownloadOnly/v1.16.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:117: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

TestDownloadOnly/v1.16.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:139: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

TestDownloadOnly/v1.28.4/cached-images (0s)

=== RUN   TestDownloadOnly/v1.28.4/cached-images
aaa_download_only_test.go:117: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.4/cached-images (0.00s)

TestDownloadOnly/v1.28.4/binaries (0s)

=== RUN   TestDownloadOnly/v1.28.4/binaries
aaa_download_only_test.go:139: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.4/binaries (0.00s)

TestDownloadOnly/v1.29.0-rc.2/cached-images (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/cached-images
aaa_download_only_test.go:117: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/cached-images (0.00s)

TestDownloadOnly/v1.29.0-rc.2/binaries (0s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/binaries
aaa_download_only_test.go:139: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:213: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:497: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestDockerEnvContainerd (0s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)
+
TestKVMDriverInstallOrUpdate (0s)
                                                
=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestKicCustomNetwork (0s)

                                                
                                                
=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestKicStaticIP (0s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:297: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestNetworkPlugins/group/cilium (5.92s)

=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:523: 
----------------------- debugLogs start: cilium-302000 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-302000

>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-302000

>>> host: /etc/nsswitch.conf:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/hosts:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/resolv.conf:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-302000

>>> host: crictl pods:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: crictl containers:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> k8s: describe netcat deployment:
error: context "cilium-302000" does not exist

>>> k8s: describe netcat pod(s):
error: context "cilium-302000" does not exist

>>> k8s: netcat logs:
error: context "cilium-302000" does not exist

>>> k8s: describe coredns deployment:
error: context "cilium-302000" does not exist

>>> k8s: describe coredns pods:
error: context "cilium-302000" does not exist

>>> k8s: coredns logs:
error: context "cilium-302000" does not exist

>>> k8s: describe api server pod(s):
error: context "cilium-302000" does not exist

>>> k8s: api server logs:
error: context "cilium-302000" does not exist

>>> host: /etc/cni:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: ip a s:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: ip r s:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: iptables-save:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: iptables table nat:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-302000

>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-302000

>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-302000" does not exist

>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-302000" does not exist

>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-302000

>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-302000

>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-302000" does not exist

>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-302000" does not exist

>>> k8s: describe kube-proxy daemon set:
error: context "cilium-302000" does not exist

>>> k8s: describe kube-proxy pod(s):
error: context "cilium-302000" does not exist

>>> k8s: kube-proxy logs:
error: context "cilium-302000" does not exist

>>> host: kubelet daemon status:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: kubelet daemon config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> k8s: kubelet logs:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> k8s: kubectl config:
apiVersion: v1
clusters: null
contexts: null
current-context: ""
kind: Config
preferences: {}
users: null

>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-302000

>>> host: docker daemon status:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: docker daemon config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/docker/daemon.json:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: docker system info:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: cri-docker daemon status:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: cri-docker daemon config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: cri-dockerd version:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: containerd daemon status:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: containerd daemon config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/containerd/config.toml:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: containerd config dump:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: crio daemon status:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: crio daemon config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: /etc/crio:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

>>> host: crio config:
* Profile "cilium-302000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-302000"

----------------------- debugLogs end: cilium-302000 [took: 5.535095802s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-302000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cilium-302000
--- SKIP: TestNetworkPlugins/group/cilium (5.92s)

TestStartStop/group/disable-driver-mounts (0.39s)

=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-813000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-813000
--- SKIP: TestStartStop/group/disable-driver-mounts (0.39s)