Test Report: Hyperkit_macOS 15411

Commit: c76b70067f4d0a064aa8eab45220402e7d36e357 | Date: 2022-11-28 | Build: 26757

Failed tests (2/301)

Order  Failed test                                      Duration (s)
243    TestPause/serial/SecondStartNoReconfiguration    47.9
314    TestNetworkPlugins/group/kubenet/HairPin         53.96
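The first failure below is an output-assertion miss: `pause_test.go:100` expects the second `minikube start` run's stdout to contain the marker string "The running cluster does not require reconfiguration", and the captured output (shown in the detail section) does not include it. A minimal sketch of that kind of check, with hypothetical function and variable names (not the actual test code):

```go
package main

import (
	"fmt"
	"strings"
)

// marker is the log line pause_test.go:100 expects in the second start's stdout.
const marker = "The running cluster does not require reconfiguration"

// containsMarker reports whether a captured start output includes the marker.
// The name is illustrative; the real test does an equivalent substring check
// on the command's combined output.
func containsMarker(output string) bool {
	return strings.Contains(output, marker)
}

func main() {
	// Abbreviated stand-in for the stdout captured in the detail below.
	got := "* Using the hyperkit driver based on existing profile\n" +
		"* Updating the running hyperkit \"pause-114428\" VM ..."
	fmt.Println(containsMarker(got)) // prints "false": the marker is absent, so the test fails
}
```

Because the start path took the "Updating the running hyperkit ... VM" branch instead of detecting an unchanged configuration, the marker line was never emitted and the assertion failed.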
TestPause/serial/SecondStartNoReconfiguration (47.9s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-114428 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-114428 --alsologtostderr -v=1 --driver=hyperkit : (40.48600042s)
pause_test.go:100: expected the second start log output to include "The running cluster does not require reconfiguration" but got: 
-- stdout --
	* [pause-114428] minikube v1.28.0 on Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	* Using the hyperkit driver based on existing profile
	* Starting control plane node pause-114428 in cluster pause-114428
	* Updating the running hyperkit "pause-114428" VM ...
	* Preparing Kubernetes v1.25.3 on Docker 20.10.21 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: storage-provisioner, default-storageclass
	* Done! kubectl is now configured to use "pause-114428" cluster and "default" namespace by default

-- /stdout --
** stderr ** 
	I1128 11:45:29.342563   23619 out.go:296] Setting OutFile to fd 1 ...
	I1128 11:45:29.342789   23619 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:29.342794   23619 out.go:309] Setting ErrFile to fd 2...
	I1128 11:45:29.342799   23619 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:29.342921   23619 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 11:45:29.343489   23619 out.go:303] Setting JSON to false
	I1128 11:45:29.365371   23619 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":9904,"bootTime":1669654825,"procs":392,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 11:45:29.365460   23619 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 11:45:29.386589   23619 out.go:177] * [pause-114428] minikube v1.28.0 on Darwin 13.0.1
	I1128 11:45:29.428385   23619 notify.go:220] Checking for updates...
	I1128 11:45:29.449266   23619 out.go:177]   - MINIKUBE_LOCATION=15411
	I1128 11:45:29.470480   23619 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:45:29.491577   23619 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 11:45:29.512408   23619 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 11:45:29.533501   23619 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 11:45:29.554902   23619 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:45:29.555299   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:29.555348   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:45:29.562613   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58619
	I1128 11:45:29.563077   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:45:29.563559   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:45:29.563572   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:45:29.563830   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:45:29.563938   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:29.564076   23619 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 11:45:29.564349   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:29.564376   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:45:29.571568   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58621
	I1128 11:45:29.571930   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:45:29.572297   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:45:29.572313   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:45:29.572502   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:45:29.572591   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:29.600404   23619 out.go:177] * Using the hyperkit driver based on existing profile
	I1128 11:45:29.621471   23619 start.go:293] selected driver: hyperkit
	I1128 11:45:29.621502   23619 start.go:837] validating driver "hyperkit" against &{Name:pause-114428 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesCon
fig:{KubernetesVersion:v1.25.3 ClusterName:pause-114428 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:2621
44 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 11:45:29.621619   23619 start.go:848] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1128 11:45:29.621686   23619 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:29.621803   23619 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/15411-14646/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1128 11:45:29.629339   23619 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.28.0
	I1128 11:45:29.632843   23619 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:29.632863   23619 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1128 11:45:29.635292   23619 cni.go:95] Creating CNI manager for ""
	I1128 11:45:29.635309   23619 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 11:45:29.635327   23619 start_flags.go:317] config:
	{Name:pause-114428 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.3 ClusterName:pause-114428 Namespace:default APIServe
rName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableO
ptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 11:45:29.635472   23619 iso.go:125] acquiring lock: {Name:mkf8786ebc65c7c4a918cffd312ffffda2a4bd0b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:29.677442   23619 out.go:177] * Starting control plane node pause-114428 in cluster pause-114428
	I1128 11:45:29.719354   23619 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:45:29.719414   23619 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4
	I1128 11:45:29.719438   23619 cache.go:57] Caching tarball of preloaded images
	I1128 11:45:29.719559   23619 preload.go:174] Found /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1128 11:45:29.719572   23619 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.3 on docker
	I1128 11:45:29.719668   23619 profile.go:148] Saving config to /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/config.json ...
	I1128 11:45:29.720033   23619 cache.go:208] Successfully downloaded all kic artifacts
	I1128 11:45:29.720060   23619 start.go:364] acquiring machines lock for pause-114428: {Name:mk027eaad0dbb84f6e95336dab244cf2d7aaac44 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1128 11:45:31.836773   23619 start.go:368] acquired machines lock for "pause-114428" in 2.116639364s
	I1128 11:45:31.836818   23619 start.go:96] Skipping create...Using existing machine configuration
	I1128 11:45:31.836835   23619 fix.go:55] fixHost starting: 
	I1128 11:45:31.837123   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:31.837149   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:45:31.844330   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58644
	I1128 11:45:31.844735   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:45:31.845054   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:45:31.845065   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:45:31.845281   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:45:31.845368   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:31.845453   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:45:31.845539   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:31.845648   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:45:31.846518   23619 fix.go:103] recreateIfNeeded on pause-114428: state=Running err=<nil>
	W1128 11:45:31.846533   23619 fix.go:129] unexpected machine state, will restart: <nil>
	I1128 11:45:31.897301   23619 out.go:177] * Updating the running hyperkit "pause-114428" VM ...
	I1128 11:45:31.918698   23619 machine.go:88] provisioning docker machine ...
	I1128 11:45:31.918735   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:31.919206   23619 main.go:134] libmachine: (pause-114428) Calling .GetMachineName
	I1128 11:45:31.919526   23619 buildroot.go:166] provisioning hostname "pause-114428"
	I1128 11:45:31.919558   23619 main.go:134] libmachine: (pause-114428) Calling .GetMachineName
	I1128 11:45:31.919809   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:31.920040   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:31.920333   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:31.920584   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:31.920843   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:31.921051   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:31.921282   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:31.921298   23619 main.go:134] libmachine: About to run SSH command:
	sudo hostname pause-114428 && echo "pause-114428" | sudo tee /etc/hostname
	I1128 11:45:32.002348   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: pause-114428
	
	I1128 11:45:32.002368   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.002511   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.002618   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.002718   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.002822   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.002963   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:32.003089   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:32.003101   23619 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-114428' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-114428/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-114428' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1128 11:45:32.089068   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I1128 11:45:32.089090   23619 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/15411-14646/.minikube CaCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/15411-14646/.minikube}
	I1128 11:45:32.089105   23619 buildroot.go:174] setting up certificates
	I1128 11:45:32.089116   23619 provision.go:83] configureAuth start
	I1128 11:45:32.089124   23619 main.go:134] libmachine: (pause-114428) Calling .GetMachineName
	I1128 11:45:32.089279   23619 main.go:134] libmachine: (pause-114428) Calling .GetIP
	I1128 11:45:32.089372   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.089450   23619 provision.go:138] copyHostCerts
	I1128 11:45:32.089545   23619 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem, removing ...
	I1128 11:45:32.089555   23619 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem
	I1128 11:45:32.089660   23619 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem (1078 bytes)
	I1128 11:45:32.089853   23619 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem, removing ...
	I1128 11:45:32.089859   23619 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem
	I1128 11:45:32.089918   23619 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem (1123 bytes)
	I1128 11:45:32.090071   23619 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem, removing ...
	I1128 11:45:32.090077   23619 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem
	I1128 11:45:32.090137   23619 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem (1679 bytes)
	I1128 11:45:32.090254   23619 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca-key.pem org=jenkins.pause-114428 san=[192.168.64.71 192.168.64.71 localhost 127.0.0.1 minikube pause-114428]
	I1128 11:45:32.264936   23619 provision.go:172] copyRemoteCerts
	I1128 11:45:32.265004   23619 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1128 11:45:32.265033   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.265178   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.265273   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.265357   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.265453   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:45:32.310124   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1128 11:45:32.333390   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem --> /etc/docker/server.pem (1216 bytes)
	I1128 11:45:32.354279   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1128 11:45:32.378940   23619 provision.go:86] duration metric: configureAuth took 289.805316ms
	I1128 11:45:32.378953   23619 buildroot.go:189] setting minikube options for container-runtime
	I1128 11:45:32.379092   23619 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:45:32.379106   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.379241   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.379317   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.379383   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.379458   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.379532   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.379678   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:32.379782   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:32.379791   23619 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I1128 11:45:32.450940   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I1128 11:45:32.450959   23619 buildroot.go:70] root file system type: tmpfs
	I1128 11:45:32.451104   23619 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I1128 11:45:32.451121   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.451249   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.451340   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.451444   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.451531   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.451668   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:32.451787   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:32.451833   23619 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I1128 11:45:32.529001   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I1128 11:45:32.529025   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.529167   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.529252   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.529350   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.529454   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.529596   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:32.529718   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:32.529731   23619 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I1128 11:45:32.603483   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I1128 11:45:32.603496   23619 machine.go:91] provisioned docker machine in 684.761925ms
	I1128 11:45:32.603511   23619 start.go:300] post-start starting for "pause-114428" (driver="hyperkit")
	I1128 11:45:32.603516   23619 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1128 11:45:32.603528   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.603721   23619 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1128 11:45:32.603734   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.603827   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.603912   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.603994   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.604075   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:45:32.643599   23619 ssh_runner.go:195] Run: cat /etc/os-release
	I1128 11:45:32.646380   23619 info.go:137] Remote host: Buildroot 2021.02.12
	I1128 11:45:32.646394   23619 filesync.go:126] Scanning /Users/jenkins/minikube-integration/15411-14646/.minikube/addons for local assets ...
	I1128 11:45:32.646483   23619 filesync.go:126] Scanning /Users/jenkins/minikube-integration/15411-14646/.minikube/files for local assets ...
	I1128 11:45:32.646647   23619 filesync.go:149] local asset: /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem -> 158232.pem in /etc/ssl/certs
	I1128 11:45:32.646822   23619 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1128 11:45:32.652770   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem --> /etc/ssl/certs/158232.pem (1708 bytes)
	I1128 11:45:32.669022   23619 start.go:303] post-start completed in 65.501467ms
	I1128 11:45:32.669040   23619 fix.go:57] fixHost completed within 832.193225ms
	I1128 11:45:32.669054   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.669186   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.669276   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.669366   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.669449   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.669575   23619 main.go:134] libmachine: Using SSH client type: native
	I1128 11:45:32.669690   23619 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.71 22 <nil> <nil>}
	I1128 11:45:32.669698   23619 main.go:134] libmachine: About to run SSH command:
	date +%s.%N
	I1128 11:45:32.737920   23619 main.go:134] libmachine: SSH cmd err, output: <nil>: 1669664732.815589153
	
	I1128 11:45:32.737931   23619 fix.go:207] guest clock: 1669664732.815589153
	I1128 11:45:32.737951   23619 fix.go:220] Guest: 2022-11-28 11:45:32.815589153 -0800 PST Remote: 2022-11-28 11:45:32.669042 -0800 PST m=+3.377067212 (delta=146.547153ms)
	I1128 11:45:32.737972   23619 fix.go:191] guest clock delta is within tolerance: 146.547153ms
	I1128 11:45:32.737976   23619 start.go:83] releasing machines lock for "pause-114428", held for 901.169251ms
	I1128 11:45:32.737995   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.738136   23619 main.go:134] libmachine: (pause-114428) Calling .GetIP
	I1128 11:45:32.738226   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.738518   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.738626   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:45:32.738716   23619 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1128 11:45:32.738748   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.738766   23619 ssh_runner.go:195] Run: cat /version.json
	I1128 11:45:32.738784   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:45:32.738900   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.738910   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:45:32.738993   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.739021   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:45:32.739072   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.739098   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:45:32.739152   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:45:32.739169   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:45:32.812312   23619 ssh_runner.go:195] Run: systemctl --version
	I1128 11:45:32.816425   23619 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:45:32.816541   23619 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1128 11:45:32.833592   23619 docker.go:613] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.3
	registry.k8s.io/kube-controller-manager:v1.25.3
	registry.k8s.io/kube-scheduler:v1.25.3
	registry.k8s.io/kube-proxy:v1.25.3
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I1128 11:45:32.833610   23619 docker.go:543] Images already preloaded, skipping extraction
	I1128 11:45:32.833683   23619 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I1128 11:45:32.842631   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1128 11:45:32.853276   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1128 11:45:32.862019   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1128 11:45:32.874649   23619 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I1128 11:45:33.050986   23619 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I1128 11:45:33.193019   23619 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1128 11:45:33.313690   23619 ssh_runner.go:195] Run: sudo systemctl restart docker
	I1128 11:45:40.746603   23619 ssh_runner.go:235] Completed: sudo systemctl restart docker: (7.432732748s)
	I1128 11:45:40.746686   23619 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I1128 11:45:40.846433   23619 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1128 11:45:40.932079   23619 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I1128 11:45:40.942491   23619 start.go:451] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I1128 11:45:40.942578   23619 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I1128 11:45:40.946602   23619 start.go:472] Will wait 60s for crictl version
	I1128 11:45:40.946651   23619 ssh_runner.go:195] Run: sudo crictl version
	I1128 11:45:40.969983   23619 start.go:481] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.21
	RuntimeApiVersion:  1.41.0
	I1128 11:45:40.970072   23619 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I1128 11:45:40.991442   23619 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I1128 11:45:41.036080   23619 out.go:204] * Preparing Kubernetes v1.25.3 on Docker 20.10.21 ...
	I1128 11:45:41.036199   23619 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I1128 11:45:41.039907   23619 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:45:41.039982   23619 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1128 11:45:41.056132   23619 docker.go:613] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.3
	registry.k8s.io/kube-controller-manager:v1.25.3
	registry.k8s.io/kube-scheduler:v1.25.3
	registry.k8s.io/kube-proxy:v1.25.3
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I1128 11:45:41.056146   23619 docker.go:543] Images already preloaded, skipping extraction
	I1128 11:45:41.056238   23619 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1128 11:45:41.073931   23619 docker.go:613] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.3
	registry.k8s.io/kube-scheduler:v1.25.3
	registry.k8s.io/kube-controller-manager:v1.25.3
	registry.k8s.io/kube-proxy:v1.25.3
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I1128 11:45:41.073948   23619 cache_images.go:84] Images are preloaded, skipping loading
	I1128 11:45:41.074041   23619 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I1128 11:45:41.095592   23619 cni.go:95] Creating CNI manager for ""
	I1128 11:45:41.095606   23619 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 11:45:41.095621   23619 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I1128 11:45:41.095634   23619 kubeadm.go:158] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.71 APIServerPort:8443 KubernetesVersion:v1.25.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-114428 NodeName:pause-114428 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.71"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.71 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kuber
netes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[]}
	I1128 11:45:41.095723   23619 kubeadm.go:163] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.71
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "pause-114428"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.71
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.71"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1128 11:45:41.095780   23619 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=pause-114428 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.71 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.3 ClusterName:pause-114428 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I1128 11:45:41.095850   23619 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.3
	I1128 11:45:41.102466   23619 binaries.go:44] Found k8s binaries, skipping transfer
	I1128 11:45:41.102519   23619 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1128 11:45:41.108944   23619 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (475 bytes)
	I1128 11:45:41.120388   23619 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1128 11:45:41.132346   23619 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2037 bytes)
	I1128 11:45:41.143831   23619 ssh_runner.go:195] Run: grep 192.168.64.71	control-plane.minikube.internal$ /etc/hosts
	I1128 11:45:41.146202   23619 certs.go:54] Setting up /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428 for IP: 192.168.64.71
	I1128 11:45:41.146323   23619 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.key
	I1128 11:45:41.146389   23619 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/15411-14646/.minikube/proxy-client-ca.key
	I1128 11:45:41.146477   23619 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key
	I1128 11:45:41.146554   23619 certs.go:298] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/apiserver.key.a47a93ce
	I1128 11:45:41.146616   23619 certs.go:298] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/proxy-client.key
	I1128 11:45:41.146847   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/15823.pem (1338 bytes)
	W1128 11:45:41.146891   23619 certs.go:384] ignoring /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/15823_empty.pem, impossibly tiny 0 bytes
	I1128 11:45:41.146903   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca-key.pem (1675 bytes)
	I1128 11:45:41.146937   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem (1078 bytes)
	I1128 11:45:41.146975   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem (1123 bytes)
	I1128 11:45:41.147009   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/key.pem (1679 bytes)
	I1128 11:45:41.147085   23619 certs.go:388] found cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem (1708 bytes)
	I1128 11:45:41.147607   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I1128 11:45:41.166618   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1128 11:45:41.184362   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1128 11:45:41.202419   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1128 11:45:41.222192   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1128 11:45:41.242459   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1128 11:45:41.258500   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1128 11:45:41.273967   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1128 11:45:41.289588   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem --> /usr/share/ca-certificates/158232.pem (1708 bytes)
	I1128 11:45:41.305392   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1128 11:45:41.320973   23619 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/15823.pem --> /usr/share/ca-certificates/15823.pem (1338 bytes)
	I1128 11:45:41.336275   23619 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1128 11:45:41.347456   23619 ssh_runner.go:195] Run: openssl version
	I1128 11:45:41.351002   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15823.pem && ln -fs /usr/share/ca-certificates/15823.pem /etc/ssl/certs/15823.pem"
	I1128 11:45:41.358368   23619 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15823.pem
	I1128 11:45:41.361909   23619 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Nov 28 18:51 /usr/share/ca-certificates/15823.pem
	I1128 11:45:41.361967   23619 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15823.pem
	I1128 11:45:41.365456   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/15823.pem /etc/ssl/certs/51391683.0"
	I1128 11:45:41.371853   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/158232.pem && ln -fs /usr/share/ca-certificates/158232.pem /etc/ssl/certs/158232.pem"
	I1128 11:45:41.379009   23619 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/158232.pem
	I1128 11:45:41.381908   23619 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Nov 28 18:51 /usr/share/ca-certificates/158232.pem
	I1128 11:45:41.381948   23619 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/158232.pem
	I1128 11:45:41.385519   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/158232.pem /etc/ssl/certs/3ec20f2e.0"
	I1128 11:45:41.392030   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1128 11:45:41.399249   23619 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1128 11:45:41.402156   23619 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Nov 28 18:44 /usr/share/ca-certificates/minikubeCA.pem
	I1128 11:45:41.402200   23619 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1128 11:45:41.405718   23619 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1128 11:45:41.412137   23619 kubeadm.go:396] StartCluster: {Name:pause-114428 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion
:v1.25.3 ClusterName:pause-114428 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] Mou
ntPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 11:45:41.412236   23619 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I1128 11:45:41.427776   23619 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1128 11:45:41.434228   23619 kubeadm.go:411] found existing configuration files, will attempt cluster restart
	I1128 11:45:41.434248   23619 kubeadm.go:627] restartCluster start
	I1128 11:45:41.434302   23619 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1128 11:45:41.440658   23619 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:41.441039   23619 kubeconfig.go:92] found "pause-114428" server: "https://192.168.64.71:8443"
	I1128 11:45:41.441594   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]
string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:45:41.442085   23619 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1128 11:45:41.448205   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:41.448255   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:41.456352   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:41.658281   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:41.658362   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:41.667403   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:41.857490   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:41.857546   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:41.866174   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:42.058030   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:42.058132   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:42.067579   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:42.257617   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:42.257689   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:42.268280   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:42.457037   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:42.457113   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:42.465852   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:42.657523   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:42.657600   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:42.666068   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:42.858443   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:42.858500   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:42.866990   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:43.057569   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:43.057646   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:43.066376   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:43.258459   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:43.258541   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:43.267173   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:43.456591   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:43.456669   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:43.465532   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:43.657025   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:43.657103   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:43.665910   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:43.856512   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:43.856590   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:43.865026   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:44.057753   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:44.057869   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:44.066055   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:44.256580   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:44.256664   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:44.273129   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:44.458085   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:44.458184   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:44.473448   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:44.473460   23619 api_server.go:165] Checking apiserver status ...
	I1128 11:45:44.473519   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W1128 11:45:44.527909   23619 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:44.527922   23619 kubeadm.go:602] needs reconfigure: apiserver error: timed out waiting for the condition
	I1128 11:45:44.527931   23619 kubeadm.go:1114] stopping kube-system containers ...
	I1128 11:45:44.528013   23619 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I1128 11:45:44.618405   23619 docker.go:444] Stopping containers: [35df9602af76 8e79d957b3d6 035a9c0a4833 40d2c766a5f4 eca5b163acee dc03b9f8fae5 36bb77fb0f5b dbe772b0925a af6c90d021e5 7cf378f4872e c2404a1daca8 cf3c8b0b6388 aaa7b0e997b9 677c0d9be193 d3c7e3e4259d 81e0e82e398a 71edc30d3dbb ff0da6531377 b3e57a29b398 84a4fabebcee b9fda07d774c 85f17aaa7130 65c3593bd95f d727bfd02878 7a46cf868e7b]
	I1128 11:45:44.618532   23619 ssh_runner.go:195] Run: docker stop 35df9602af76 8e79d957b3d6 035a9c0a4833 40d2c766a5f4 eca5b163acee dc03b9f8fae5 36bb77fb0f5b dbe772b0925a af6c90d021e5 7cf378f4872e c2404a1daca8 cf3c8b0b6388 aaa7b0e997b9 677c0d9be193 d3c7e3e4259d 81e0e82e398a 71edc30d3dbb ff0da6531377 b3e57a29b398 84a4fabebcee b9fda07d774c 85f17aaa7130 65c3593bd95f d727bfd02878 7a46cf868e7b
	I1128 11:45:45.421215   23619 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1128 11:45:45.478253   23619 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1128 11:45:45.493122   23619 kubeadm.go:155] found existing configuration files:
	-rw------- 1 root root 5643 Nov 28 19:44 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5657 Nov 28 19:44 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 1987 Nov 28 19:45 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5605 Nov 28 19:44 /etc/kubernetes/scheduler.conf
	
	I1128 11:45:45.493194   23619 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1128 11:45:45.509592   23619 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1128 11:45:45.522416   23619 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1128 11:45:45.534014   23619 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:45.534081   23619 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1128 11:45:45.551905   23619 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1128 11:45:45.564983   23619 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1128 11:45:45.565065   23619 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1128 11:45:45.580748   23619 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1128 11:45:45.595840   23619 kubeadm.go:704] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I1128 11:45:45.595854   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:45.660616   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:46.137320   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:46.279374   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:46.333519   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:46.393169   23619 api_server.go:51] waiting for apiserver process to appear ...
	I1128 11:45:46.393253   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:45:46.907161   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:45:47.406454   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:45:47.906511   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:45:48.406126   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:45:48.415996   23619 api_server.go:71] duration metric: took 2.022787598s to wait for apiserver process to appear ...
	I1128 11:45:48.416011   23619 api_server.go:87] waiting for apiserver healthz status ...
	I1128 11:45:48.416026   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:45:51.336183   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W1128 11:45:51.336201   23619 api_server.go:102] status: https://192.168.64.71:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I1128 11:45:51.837655   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:45:51.841933   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1128 11:45:51.841949   23619 api_server.go:102] status: https://192.168.64.71:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1128 11:45:52.336934   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:45:52.343081   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1128 11:45:52.343096   23619 api_server.go:102] status: https://192.168.64.71:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1128 11:45:52.836986   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:45:52.841884   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 200:
	ok
	I1128 11:45:52.847607   23619 api_server.go:140] control plane version: v1.25.3
	I1128 11:45:52.847622   23619 api_server.go:130] duration metric: took 4.431511234s to wait for apiserver health ...
	I1128 11:45:52.847628   23619 cni.go:95] Creating CNI manager for ""
	I1128 11:45:52.847633   23619 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 11:45:52.847645   23619 system_pods.go:43] waiting for kube-system pods to appear ...
	I1128 11:45:52.856415   23619 system_pods.go:59] 6 kube-system pods found
	I1128 11:45:52.856431   23619 system_pods.go:61] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1128 11:45:52.856437   23619 system_pods.go:61] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1128 11:45:52.856441   23619 system_pods.go:61] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1128 11:45:52.856446   23619 system_pods.go:61] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1128 11:45:52.856449   23619 system_pods.go:61] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:45:52.856453   23619 system_pods.go:61] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1128 11:45:52.856457   23619 system_pods.go:74] duration metric: took 8.807578ms to wait for pod list to return data ...
	I1128 11:45:52.856462   23619 node_conditions.go:102] verifying NodePressure condition ...
	I1128 11:45:52.859255   23619 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I1128 11:45:52.859273   23619 node_conditions.go:123] node cpu capacity is 2
	I1128 11:45:52.859283   23619 node_conditions.go:105] duration metric: took 2.817884ms to run NodePressure ...
	I1128 11:45:52.859294   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.3:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I1128 11:45:53.017463   23619 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I1128 11:45:53.019985   23619 kubeadm.go:778] kubelet initialised
	I1128 11:45:53.019996   23619 kubeadm.go:779] duration metric: took 2.518825ms waiting for restarted kubelet to initialise ...
	I1128 11:45:53.020004   23619 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:45:53.023130   23619 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:55.030954   23619 pod_ready.go:102] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:55.531849   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:55.531861   23619 pod_ready.go:81] duration metric: took 2.508666905s waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:55.531868   23619 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:57.538943   23619 pod_ready.go:102] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:58.038339   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:58.038353   23619 pod_ready.go:81] duration metric: took 2.506427283s waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:58.038359   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:00.045962   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:02.545073   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:04.547145   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:06.048083   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.048097   23619 pod_ready.go:81] duration metric: took 8.009560784s waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.048103   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051216   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.051224   23619 pod_ready.go:81] duration metric: took 3.116595ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051230   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053957   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.053964   23619 pod_ready.go:81] duration metric: took 2.729921ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053969   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.562996   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.563009   23619 pod_ready.go:81] duration metric: took 509.018441ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.563015   23619 pod_ready.go:38] duration metric: took 13.542714077s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.563028   23619 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1128 11:46:06.570229   23619 ops.go:34] apiserver oom_adj: -16
	I1128 11:46:06.570238   23619 kubeadm.go:631] restartCluster took 25.135447778s
	I1128 11:46:06.570243   23619 kubeadm.go:398] StartCluster complete in 25.15757485s
	I1128 11:46:06.570252   23619 settings.go:142] acquiring lock: {Name:mk7392d5e25d999df834aabe5db592398d1f845f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.570337   23619 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:46:06.570709   23619 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/15411-14646/kubeconfig: {Name:mk58f8fa3810393ea66460d5cf44fc66020c4987 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.571302   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.573288   23619 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-114428" rescaled to 1
	I1128 11:46:06.573316   23619 start.go:212] Will wait 6m0s for node &{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1128 11:46:06.573325   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1128 11:46:06.595199   23619 out.go:177] * Verifying Kubernetes components...
	I1128 11:46:06.573354   23619 addons.go:486] enableAddons start: toEnable=map[], additional=[]
	I1128 11:46:06.573498   23619 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:46:06.629592   23619 start.go:806] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I1128 11:46:06.637368   23619 addons.go:65] Setting default-storageclass=true in profile "pause-114428"
	I1128 11:46:06.637368   23619 addons.go:65] Setting storage-provisioner=true in profile "pause-114428"
	I1128 11:46:06.637416   23619 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-114428"
	I1128 11:46:06.637436   23619 addons.go:227] Setting addon storage-provisioner=true in "pause-114428"
	I1128 11:46:06.637382   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	W1128 11:46:06.637448   23619 addons.go:236] addon storage-provisioner should already be in state true
	I1128 11:46:06.637510   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.637835   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637860   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.637905   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637925   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.645886   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58739
	I1128 11:46:06.646195   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58741
	I1128 11:46:06.646301   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646561   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646698   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646713   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646891   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646902   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646912   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647097   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647214   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.647287   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.647304   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.647310   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.647389   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.648684   23619 node_ready.go:35] waiting up to 6m0s for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.649127   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.650857   23619 node_ready.go:49] node "pause-114428" has status "Ready":"True"
	I1128 11:46:06.650866   23619 node_ready.go:38] duration metric: took 2.162932ms waiting for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.650871   23619 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.651663   23619 addons.go:227] Setting addon default-storageclass=true in "pause-114428"
	W1128 11:46:06.651672   23619 addons.go:236] addon default-storageclass should already be in state true
	I1128 11:46:06.651690   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.651949   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.651975   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.654611   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58743
	I1128 11:46:06.654971   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.655291   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.655304   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.655510   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.655602   23619 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.655605   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.655686   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.655766   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.656669   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.659108   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58745
	I1128 11:46:06.678307   23619 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1128 11:46:06.679132   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.699526   23619 addons.go:419] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.699546   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1128 11:46:06.699568   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.699800   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.700032   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.700176   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.700207   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.700229   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.700424   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.700648   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.701359   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.701397   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.709163   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58748
	I1128 11:46:06.709518   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.709864   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.709878   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.710074   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.710166   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.710244   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.710318   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.711180   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.711324   23619 addons.go:419] installing /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.711332   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1128 11:46:06.711341   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.711421   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.711501   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.711570   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.711656   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.752029   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.761901   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.844170   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.844181   23619 pod_ready.go:81] duration metric: took 188.565794ms waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.844189   23619 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245650   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.245661   23619 pod_ready.go:81] duration metric: took 401.459797ms waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245670   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.358355   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358369   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358466   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358476   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358539   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358554   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358561   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358569   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358578   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358605   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358613   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358622   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358628   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358710   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358722   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358722   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358738   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358754   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358877   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358891   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358932   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358981   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.359000   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.359083   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.398686   23619 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I1128 11:46:07.457752   23619 addons.go:488] enableAddons completed in 884.379177ms
	I1128 11:46:07.646777   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.646790   23619 pod_ready.go:81] duration metric: took 401.1063ms waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.646798   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043898   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.043909   23619 pod_ready.go:81] duration metric: took 397.096432ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043915   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445110   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.445121   23619 pod_ready.go:81] duration metric: took 401.192773ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445128   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844356   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.844366   23619 pod_ready.go:81] duration metric: took 399.225652ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844372   23619 pod_ready.go:38] duration metric: took 2.193441179s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:08.844385   23619 api_server.go:51] waiting for apiserver process to appear ...
	I1128 11:46:08.844445   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:46:08.853453   23619 api_server.go:71] duration metric: took 2.280070722s to wait for apiserver process to appear ...
	I1128 11:46:08.853468   23619 api_server.go:87] waiting for apiserver healthz status ...
	I1128 11:46:08.853477   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:46:08.857396   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 200:
	ok
	I1128 11:46:08.857918   23619 api_server.go:140] control plane version: v1.25.3
	I1128 11:46:08.857929   23619 api_server.go:130] duration metric: took 4.455033ms to wait for apiserver health ...
	I1128 11:46:08.857934   23619 system_pods.go:43] waiting for kube-system pods to appear ...
	I1128 11:46:09.045441   23619 system_pods.go:59] 7 kube-system pods found
	I1128 11:46:09.045455   23619 system_pods.go:61] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.045459   23619 system_pods.go:61] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.045463   23619 system_pods.go:61] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.045466   23619 system_pods.go:61] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.045470   23619 system_pods.go:61] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.045473   23619 system_pods.go:61] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.045478   23619 system_pods.go:61] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1128 11:46:09.045482   23619 system_pods.go:74] duration metric: took 187.540087ms to wait for pod list to return data ...
	I1128 11:46:09.045488   23619 default_sa.go:34] waiting for default service account to be created ...
	I1128 11:46:09.243309   23619 default_sa.go:45] found service account: "default"
	I1128 11:46:09.243320   23619 default_sa.go:55] duration metric: took 197.823748ms for default service account to be created ...
	I1128 11:46:09.243326   23619 system_pods.go:116] waiting for k8s-apps to be running ...
	I1128 11:46:09.445692   23619 system_pods.go:86] 7 kube-system pods found
	I1128 11:46:09.445706   23619 system_pods.go:89] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.445711   23619 system_pods.go:89] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.445714   23619 system_pods.go:89] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.445718   23619 system_pods.go:89] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.445721   23619 system_pods.go:89] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.445725   23619 system_pods.go:89] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.445729   23619 system_pods.go:89] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Running
	I1128 11:46:09.445733   23619 system_pods.go:126] duration metric: took 202.39991ms to wait for k8s-apps to be running ...
	I1128 11:46:09.445739   23619 system_svc.go:44] waiting for kubelet service to be running ....
	I1128 11:46:09.445802   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1128 11:46:09.455773   23619 system_svc.go:56] duration metric: took 10.02855ms WaitForService to wait for kubelet.
	I1128 11:46:09.455787   23619 kubeadm.go:573] duration metric: took 2.882394245s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I1128 11:46:09.455798   23619 node_conditions.go:102] verifying NodePressure condition ...
	I1128 11:46:09.649312   23619 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I1128 11:46:09.649327   23619 node_conditions.go:123] node cpu capacity is 2
	I1128 11:46:09.649353   23619 node_conditions.go:105] duration metric: took 193.543357ms to run NodePressure ...
	I1128 11:46:09.649377   23619 start.go:217] waiting for startup goroutines ...
	I1128 11:46:09.649843   23619 ssh_runner.go:195] Run: rm -f paused
	I1128 11:46:09.693369   23619 start.go:535] kubectl: 1.25.2, cluster: 1.25.3 (minor skew: 0)
	I1128 11:46:09.719383   23619 out.go:177] * Done! kubectl is now configured to use "pause-114428" cluster and "default" namespace by default

                                                
                                                
** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-114428 -n pause-114428
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-114428 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-114428 logs -n 25: (3.432771041s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |          Profile          |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:39 PST | 28 Nov 22 11:40 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.3   |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:40 PST |                     |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0   |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:40 PST | 28 Nov 22 11:41 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.3   |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	| start   | -p cert-expiration-113723      | cert-expiration-113723    | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	|         | --memory=2048                  |                           |         |         |                     |                     |
	|         | --cert-expiration=8760h        |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p cert-expiration-113723      | cert-expiration-113723    | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	| start   | -p stopped-upgrade-114107      | stopped-upgrade-114107    | jenkins | v1.28.0 | 28 Nov 22 11:43 PST | 28 Nov 22 11:44 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p running-upgrade-114136      | running-upgrade-114136    | jenkins | v1.28.0 | 28 Nov 22 11:43 PST | 28 Nov 22 11:44 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p stopped-upgrade-114107      | stopped-upgrade-114107    | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:44 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:44 PST |                     |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --kubernetes-version=1.20      |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:45 PST |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p running-upgrade-114136      | running-upgrade-114136    | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:44 PST |
	| start   | -p pause-114428 --memory=2048  | pause-114428              | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:45 PST |
	|         | --install-addons=false         |                           |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit   |                           |         |         |                     |                     |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p pause-114428                | pause-114428              | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:46 PST |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| ssh     | -p NoKubernetes-114420 sudo    | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | systemctl is-active --quiet    |                           |         |         |                     |                     |
	|         | service kubelet                |                           |         |         |                     |                     |
	| profile | list                           | minikube                  | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| profile | list --output=json             | minikube                  | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| stop    | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| ssh     | -p NoKubernetes-114420 sudo    | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | systemctl is-active --quiet    |                           |         |         |                     |                     |
	|         | service kubelet                |                           |         |         |                     |                     |
	| delete  | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p auto-113537 --memory=2048   | auto-113537               | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | --alsologtostderr              |                           |         |         |                     |                     |
	|         | --wait=true --wait-timeout=5m  |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	|---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/11/28 11:45:56
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19.3 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1128 11:45:56.335795   23695 out.go:296] Setting OutFile to fd 1 ...
	I1128 11:45:56.335982   23695 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:56.335987   23695 out.go:309] Setting ErrFile to fd 2...
	I1128 11:45:56.335991   23695 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:56.336102   23695 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 11:45:56.336655   23695 out.go:303] Setting JSON to false
	I1128 11:45:56.355529   23695 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":9931,"bootTime":1669654825,"procs":398,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 11:45:56.355630   23695 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 11:45:56.377508   23695 out.go:177] * [auto-113537] minikube v1.28.0 on Darwin 13.0.1
	I1128 11:45:56.399253   23695 notify.go:220] Checking for updates...
	I1128 11:45:56.399295   23695 out.go:177]   - MINIKUBE_LOCATION=15411
	I1128 11:45:56.421341   23695 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:45:56.443238   23695 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 11:45:56.464325   23695 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 11:45:56.507037   23695 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 11:45:56.544554   23695 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:45:56.544601   23695 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 11:45:56.573415   23695 out.go:177] * Using the hyperkit driver based on user configuration
	I1128 11:45:56.615268   23695 start.go:293] selected driver: hyperkit
	I1128 11:45:56.615328   23695 start.go:837] validating driver "hyperkit" against <nil>
	I1128 11:45:56.615366   23695 start.go:848] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1128 11:45:56.619138   23695 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:56.619304   23695 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/15411-14646/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1128 11:45:56.626281   23695 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.28.0
	I1128 11:45:56.629580   23695 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:56.629598   23695 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1128 11:45:56.629636   23695 start_flags.go:303] no existing cluster config was found, will generate one from the flags 
	I1128 11:45:56.629793   23695 start_flags.go:910] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1128 11:45:56.629820   23695 cni.go:95] Creating CNI manager for ""
	I1128 11:45:56.629829   23695 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 11:45:56.629837   23695 start_flags.go:317] config:
	{Name:auto-113537 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.3 ClusterName:auto-113537 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRI
Socket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 11:45:56.629944   23695 iso.go:125] acquiring lock: {Name:mkf8786ebc65c7c4a918cffd312ffffda2a4bd0b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:56.672004   23695 out.go:177] * Starting control plane node auto-113537 in cluster auto-113537
	I1128 11:45:56.711339   23695 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:45:56.711457   23695 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4
	I1128 11:45:56.711496   23695 cache.go:57] Caching tarball of preloaded images
	I1128 11:45:56.711745   23695 preload.go:174] Found /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1128 11:45:56.711773   23695 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.3 on docker
	I1128 11:45:56.711928   23695 profile.go:148] Saving config to /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/config.json ...
	I1128 11:45:56.711982   23695 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/config.json: {Name:mk7f26573ec22434b9a6aa9a5cdee227059dc03f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:45:56.712562   23695 cache.go:208] Successfully downloaded all kic artifacts
	I1128 11:45:56.712614   23695 start.go:364] acquiring machines lock for auto-113537: {Name:mk027eaad0dbb84f6e95336dab244cf2d7aaac44 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1128 11:45:56.712714   23695 start.go:368] acquired machines lock for "auto-113537" in 85.681µs
	I1128 11:45:56.712758   23695 start.go:93] Provisioning new machine with config: &{Name:auto-113537 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConf
ig:{KubernetesVersion:v1.25.3 ClusterName:auto-113537 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions
:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet} &{Name: IP: Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1128 11:45:56.712867   23695 start.go:125] createHost starting for "" (driver="hyperkit")
	I1128 11:45:55.030954   23619 pod_ready.go:102] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:55.531849   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:55.531861   23619 pod_ready.go:81] duration metric: took 2.508666905s waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:55.531868   23619 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:57.538943   23619 pod_ready.go:102] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:58.038339   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:58.038353   23619 pod_ready.go:81] duration metric: took 2.506427283s waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:58.038359   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:56.750239   23695 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I1128 11:45:56.750744   23695 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:56.750817   23695 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:45:56.758738   23695 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58736
	I1128 11:45:56.759133   23695 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:45:56.759558   23695 main.go:134] libmachine: Using API Version  1
	I1128 11:45:56.759569   23695 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:45:56.759778   23695 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:45:56.759865   23695 main.go:134] libmachine: (auto-113537) Calling .GetMachineName
	I1128 11:45:56.759947   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:45:56.760046   23695 start.go:159] libmachine.API.Create for "auto-113537" (driver="hyperkit")
	I1128 11:45:56.760071   23695 client.go:168] LocalClient.Create starting
	I1128 11:45:56.760114   23695 main.go:134] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem
	I1128 11:45:56.760161   23695 main.go:134] libmachine: Decoding PEM data...
	I1128 11:45:56.760177   23695 main.go:134] libmachine: Parsing certificate...
	I1128 11:45:56.760239   23695 main.go:134] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem
	I1128 11:45:56.760271   23695 main.go:134] libmachine: Decoding PEM data...
	I1128 11:45:56.760283   23695 main.go:134] libmachine: Parsing certificate...
	I1128 11:45:56.760299   23695 main.go:134] libmachine: Running pre-create checks...
	I1128 11:45:56.760306   23695 main.go:134] libmachine: (auto-113537) Calling .PreCreateCheck
	I1128 11:45:56.760378   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:56.760573   23695 main.go:134] libmachine: (auto-113537) Calling .GetConfigRaw
	I1128 11:45:56.761021   23695 main.go:134] libmachine: Creating machine...
	I1128 11:45:56.761030   23695 main.go:134] libmachine: (auto-113537) Calling .Create
	I1128 11:45:56.761106   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:56.761231   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:56.761096   23703 common.go:116] Making disk image using store path: /Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 11:45:56.761285   23695 main.go:134] libmachine: (auto-113537) Downloading /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/15411-14646/.minikube/cache/iso/amd64/minikube-v1.28.0-1668700269-15235-amd64.iso...
	I1128 11:45:56.911471   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:56.911404   23703 common.go:123] Creating ssh key: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa...
	I1128 11:45:57.006986   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:57.006900   23703 common.go:129] Creating raw disk image: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk...
	I1128 11:45:57.007008   23695 main.go:134] libmachine: (auto-113537) DBG | Writing magic tar header
	I1128 11:45:57.007017   23695 main.go:134] libmachine: (auto-113537) DBG | Writing SSH key tar header
	I1128 11:45:57.007972   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:57.007927   23703 common.go:143] Fixing permissions on /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537 ...
	I1128 11:45:57.155343   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:57.155360   23695 main.go:134] libmachine: (auto-113537) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid
	I1128 11:45:57.155400   23695 main.go:134] libmachine: (auto-113537) DBG | Using UUID 46b4ad8a-6f55-11ed-a6ea-f01898ef957c
	I1128 11:45:57.183655   23695 main.go:134] libmachine: (auto-113537) DBG | Generated MAC b6:f5:64:15:38:f8
	I1128 11:45:57.183679   23695 main.go:134] libmachine: (auto-113537) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537
	I1128 11:45:57.183717   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"46b4ad8a-6f55-11ed-a6ea-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000206ed0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage", Initrd:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1128 11:45:57.183757   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"46b4ad8a-6f55-11ed-a6ea-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000206ed0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage", Initrd:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1128 11:45:57.183813   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "46b4ad8a-6f55-11ed-a6ea-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/tty,log=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537"}
	I1128 11:45:57.183862   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 46b4ad8a-6f55-11ed-a6ea-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/tty,log=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/console-ring -f kexec,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537"
	I1128 11:45:57.183878   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1128 11:45:57.185452   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Pid is 23706
	I1128 11:45:57.185821   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 0
	I1128 11:45:57.185840   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:57.185931   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:45:57.187669   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:45:57.187786   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:45:57.187797   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:45:57.187821   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:45:57.187828   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:45:57.187835   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:45:57.187845   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:45:57.187855   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:45:57.187866   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:45:57.187874   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:45:57.187881   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:45:57.187901   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:45:57.187919   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:45:57.187934   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:45:57.187950   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:45:57.187961   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:45:57.187970   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:45:57.187980   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:45:57.187991   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:45:57.188002   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:45:57.188012   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:45:57.188020   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:45:57.188028   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:45:57.188037   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:45:57.188056   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:45:57.188071   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:45:57.188080   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:45:57.188090   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:45:57.188098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:45:57.188105   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:45:57.188120   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:45:57.188134   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:45:57.188154   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:45:57.188173   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:45:57.188189   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:45:57.188213   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:45:57.188230   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:45:57.188243   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:45:57.188252   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:45:57.188260   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:45:57.188272   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:45:57.188283   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:45:57.188295   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:45:57.188306   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:45:57.188315   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:45:57.188324   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:45:57.188332   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:45:57.188352   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:45:57.188365   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:45:57.188375   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:45:57.188383   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:45:57.188401   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:45:57.188422   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:45:57.188436   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:45:57.188445   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:45:57.188453   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:45:57.188464   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:45:57.188473   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:45:57.188489   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:45:57.188501   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:45:57.188508   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:45:57.188517   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:45:57.188524   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:45:57.188533   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:45:57.188540   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:45:57.188548   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:45:57.188556   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:45:57.188564   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:45:57.188572   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:45:57.188580   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:45:57.188601   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:45:57.188610   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:45:57.188618   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:45:57.191973   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1128 11:45:57.198964   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1128 11:45:57.199627   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1128 11:45:57.199645   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1128 11:45:57.199675   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1128 11:45:57.199689   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1128 11:45:57.547255   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1128 11:45:57.547271   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1128 11:45:57.651305   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1128 11:45:57.651326   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1128 11:45:57.651345   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1128 11:45:57.651361   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1128 11:45:57.652178   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1128 11:45:57.652186   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1128 11:45:59.189654   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 1
	I1128 11:45:59.189670   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:59.189730   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:45:59.190510   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:45:59.190633   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:45:59.190642   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:45:59.190652   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:45:59.190658   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:45:59.190665   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:45:59.190675   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:45:59.190682   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:45:59.190690   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:45:59.190708   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:45:59.190719   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:45:59.190741   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:45:59.190752   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:45:59.190759   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:45:59.190765   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:45:59.190775   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:45:59.190782   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:45:59.190794   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:45:59.190801   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:45:59.190811   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:45:59.190818   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:45:59.190826   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:45:59.190839   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:45:59.190859   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:45:59.190872   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:45:59.190880   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:45:59.190888   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:45:59.190898   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:45:59.190906   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:45:59.190914   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:45:59.190920   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:45:59.190927   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:45:59.190937   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:45:59.190944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:45:59.190953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:45:59.190960   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:45:59.190969   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:45:59.190976   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:45:59.190984   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:45:59.190992   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:45:59.191003   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:45:59.191012   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:45:59.191021   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:45:59.191029   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:45:59.191036   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:45:59.191051   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:45:59.191066   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:45:59.191075   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:45:59.191083   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:45:59.191091   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:45:59.191098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:45:59.191107   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:45:59.191115   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:45:59.191126   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:45:59.191133   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:45:59.191140   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:45:59.191148   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:45:59.191156   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:45:59.191163   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:45:59.191173   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:45:59.191183   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:45:59.191195   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:45:59.191203   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:45:59.191212   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:45:59.191219   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:45:59.191228   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:45:59.191253   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:45:59.191284   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:45:59.191292   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:45:59.191305   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:45:59.191314   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:45:59.191325   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:45:59.191335   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:01.191376   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 2
	I1128 11:46:01.191406   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:01.191464   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:01.192277   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:01.192350   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:46:01.192361   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:46:01.192373   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:46:01.192387   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:46:01.192396   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:46:01.192405   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:46:01.192417   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:46:01.192424   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:46:01.192431   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:46:01.192438   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:46:01.192449   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:46:01.192460   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:46:01.192467   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:46:01.192474   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:46:01.192481   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:46:01.192491   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:46:01.192501   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:46:01.192510   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:46:01.192517   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:46:01.192524   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:46:01.192554   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:46:01.192568   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:46:01.192577   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:46:01.192584   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:46:01.192593   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:46:01.192600   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:46:01.192608   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:46:01.192619   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:46:01.192626   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:46:01.192635   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:46:01.192645   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:46:01.192655   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:46:01.192663   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:46:01.192670   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:46:01.192686   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:46:01.192698   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:46:01.192707   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:46:01.192713   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:46:01.192726   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:46:01.192741   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:46:01.192764   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:46:01.192778   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:46:01.192789   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:46:01.192797   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:46:01.192812   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:46:01.192821   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:46:01.192832   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:46:01.192841   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:46:01.192848   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:46:01.192857   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:46:01.192864   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:46:01.192870   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:46:01.192877   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:46:01.192884   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:46:01.192892   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:46:01.192899   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:46:01.192913   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:46:01.192921   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:46:01.192929   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:46:01.192936   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:46:01.192944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:46:01.192953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:46:01.192965   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:46:01.192975   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:46:01.192991   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:46:01.193006   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:46:01.193015   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:46:01.193029   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:46:01.193040   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:46:01.193050   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:46:01.193071   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:46:01.193089   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:00.045962   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:02.545073   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:02.074552   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1128 11:46:02.074660   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1128 11:46:02.074670   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1128 11:46:03.194401   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 3
	I1128 11:46:03.194417   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:03.194477   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:03.195494   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:03.195586   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:46:03.195595   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:46:03.195605   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:46:03.195619   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:46:03.195628   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:46:03.195635   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:46:03.195646   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:46:03.195656   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:46:03.195662   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:46:03.195670   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:46:03.195683   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:46:03.195693   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:46:03.195702   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:46:03.195710   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:46:03.195716   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:46:03.195725   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:46:03.195736   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:46:03.195745   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:46:03.195753   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:46:03.195762   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:46:03.195771   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:46:03.195788   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:46:03.195795   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:46:03.195803   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:46:03.195811   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:46:03.195822   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:46:03.195831   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:46:03.195840   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:46:03.195848   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:46:03.195856   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:46:03.195865   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:46:03.195874   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:46:03.195881   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:46:03.195889   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:46:03.195897   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:46:03.195905   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:46:03.195929   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:46:03.195944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:46:03.195953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:46:03.195962   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:46:03.195970   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:46:03.195978   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:46:03.195986   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:46:03.195994   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:46:03.196001   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:46:03.196011   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:46:03.196020   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:46:03.196028   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:46:03.196038   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:46:03.196046   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:46:03.196054   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:46:03.196062   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:46:03.196073   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:46:03.196082   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:46:03.196090   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:46:03.196098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:46:03.196106   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:46:03.196114   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:46:03.196124   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:46:03.196133   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:46:03.196141   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:46:03.196149   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:46:03.196156   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:46:03.196164   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:46:03.196171   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:46:03.196180   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:46:03.196187   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:46:03.196195   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:46:03.196208   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:46:03.196217   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:46:03.196225   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:46:03.196232   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:05.197528   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 4
	I1128 11:46:05.197556   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:05.197673   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:05.198552   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:05.198728   23695 main.go:134] libmachine: (auto-113537) DBG | Found 72 entries in /var/db/dhcpd_leases!
	I1128 11:46:05.198743   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.73 HWAddress:b6:f5:64:15:38:f8 ID:1,b6:f5:64:15:38:f8 Lease:0x6386617c}
	I1128 11:46:05.198757   23695 main.go:134] libmachine: (auto-113537) DBG | Found match: b6:f5:64:15:38:f8
	I1128 11:46:05.198764   23695 main.go:134] libmachine: (auto-113537) DBG | IP: 192.168.64.73
	I1128 11:46:05.198803   23695 main.go:134] libmachine: (auto-113537) Calling .GetConfigRaw
	I1128 11:46:05.199496   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:05.199636   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:05.199763   23695 main.go:134] libmachine: Waiting for machine to be running, this may take a few minutes...
	I1128 11:46:05.199776   23695 main.go:134] libmachine: (auto-113537) Calling .GetState
	I1128 11:46:05.199902   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:05.199965   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:05.200645   23695 main.go:134] libmachine: Detecting operating system of created instance...
	I1128 11:46:05.200656   23695 main.go:134] libmachine: Waiting for SSH to be available...
	I1128 11:46:05.200663   23695 main.go:134] libmachine: Getting to WaitForSSH function...
	I1128 11:46:05.200672   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:05.200797   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:05.200928   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:05.201043   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:05.201151   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:05.201306   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:05.201509   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:05.201521   23695 main.go:134] libmachine: About to run SSH command:
	exit 0
	I1128 11:46:05.225652   23695 main.go:134] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I1128 11:46:04.547145   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:06.048083   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.048097   23619 pod_ready.go:81] duration metric: took 8.009560784s waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.048103   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051216   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.051224   23619 pod_ready.go:81] duration metric: took 3.116595ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051230   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053957   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.053964   23619 pod_ready.go:81] duration metric: took 2.729921ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053969   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.562996   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.563009   23619 pod_ready.go:81] duration metric: took 509.018441ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.563015   23619 pod_ready.go:38] duration metric: took 13.542714077s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.563028   23619 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1128 11:46:06.570229   23619 ops.go:34] apiserver oom_adj: -16
	I1128 11:46:06.570238   23619 kubeadm.go:631] restartCluster took 25.135447778s
	I1128 11:46:06.570243   23619 kubeadm.go:398] StartCluster complete in 25.15757485s
	I1128 11:46:06.570252   23619 settings.go:142] acquiring lock: {Name:mk7392d5e25d999df834aabe5db592398d1f845f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.570337   23619 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:46:06.570709   23619 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/15411-14646/kubeconfig: {Name:mk58f8fa3810393ea66460d5cf44fc66020c4987 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.571302   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]
string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.573288   23619 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-114428" rescaled to 1
	I1128 11:46:06.573316   23619 start.go:212] Will wait 6m0s for node &{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1128 11:46:06.573325   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1128 11:46:06.595199   23619 out.go:177] * Verifying Kubernetes components...
	I1128 11:46:06.573354   23619 addons.go:486] enableAddons start: toEnable=map[], additional=[]
	I1128 11:46:06.573498   23619 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:46:06.629592   23619 start.go:806] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I1128 11:46:06.637368   23619 addons.go:65] Setting default-storageclass=true in profile "pause-114428"
	I1128 11:46:06.637368   23619 addons.go:65] Setting storage-provisioner=true in profile "pause-114428"
	I1128 11:46:06.637416   23619 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-114428"
	I1128 11:46:06.637436   23619 addons.go:227] Setting addon storage-provisioner=true in "pause-114428"
	I1128 11:46:06.637382   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	W1128 11:46:06.637448   23619 addons.go:236] addon storage-provisioner should already be in state true
	I1128 11:46:06.637510   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.637835   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637860   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.637905   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637925   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.645886   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58739
	I1128 11:46:06.646195   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58741
	I1128 11:46:06.646301   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646561   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646698   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646713   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646891   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646902   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646912   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647097   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647214   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.647287   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.647304   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.647310   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.647389   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.648684   23619 node_ready.go:35] waiting up to 6m0s for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.649127   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]
string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.650857   23619 node_ready.go:49] node "pause-114428" has status "Ready":"True"
	I1128 11:46:06.650866   23619 node_ready.go:38] duration metric: took 2.162932ms waiting for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.650871   23619 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.651663   23619 addons.go:227] Setting addon default-storageclass=true in "pause-114428"
	W1128 11:46:06.651672   23619 addons.go:236] addon default-storageclass should already be in state true
	I1128 11:46:06.651690   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.651949   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.651975   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.654611   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58743
	I1128 11:46:06.654971   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.655291   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.655304   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.655510   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.655602   23619 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.655605   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.655686   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.655766   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.656669   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.659108   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58745
	I1128 11:46:06.678307   23619 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1128 11:46:06.679132   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.699526   23619 addons.go:419] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.699546   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1128 11:46:06.699568   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.699800   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.700032   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.700176   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.700207   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.700229   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.700424   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.700648   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.701359   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.701397   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.709163   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58748
	I1128 11:46:06.709518   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.709864   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.709878   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.710074   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.710166   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.710244   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.710318   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.711180   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.711324   23619 addons.go:419] installing /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.711332   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1128 11:46:06.711341   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.711421   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.711501   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.711570   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.711656   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.752029   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.761901   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.844170   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.844181   23619 pod_ready.go:81] duration metric: took 188.565794ms waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.844189   23619 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245650   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.245661   23619 pod_ready.go:81] duration metric: took 401.459797ms waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245670   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.358355   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358369   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358466   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358476   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358539   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358554   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358561   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358569   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358578   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358605   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358613   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358622   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358628   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358710   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358722   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358722   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358738   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358754   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358877   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358891   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358932   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358981   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.359000   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.359083   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.398686   23619 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I1128 11:46:07.457752   23619 addons.go:488] enableAddons completed in 884.379177ms
	I1128 11:46:07.646777   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.646790   23619 pod_ready.go:81] duration metric: took 401.1063ms waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.646798   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043898   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.043909   23619 pod_ready.go:81] duration metric: took 397.096432ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043915   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445110   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.445121   23619 pod_ready.go:81] duration metric: took 401.192773ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445128   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844356   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.844366   23619 pod_ready.go:81] duration metric: took 399.225652ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844372   23619 pod_ready.go:38] duration metric: took 2.193441179s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:08.844385   23619 api_server.go:51] waiting for apiserver process to appear ...
	I1128 11:46:08.844445   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:46:08.853453   23619 api_server.go:71] duration metric: took 2.280070722s to wait for apiserver process to appear ...
	I1128 11:46:08.853468   23619 api_server.go:87] waiting for apiserver healthz status ...
	I1128 11:46:08.853477   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:46:08.857396   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 200:
	ok
	I1128 11:46:08.857918   23619 api_server.go:140] control plane version: v1.25.3
	I1128 11:46:08.857929   23619 api_server.go:130] duration metric: took 4.455033ms to wait for apiserver health ...
	I1128 11:46:08.857934   23619 system_pods.go:43] waiting for kube-system pods to appear ...
	I1128 11:46:09.045441   23619 system_pods.go:59] 7 kube-system pods found
	I1128 11:46:09.045455   23619 system_pods.go:61] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.045459   23619 system_pods.go:61] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.045463   23619 system_pods.go:61] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.045466   23619 system_pods.go:61] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.045470   23619 system_pods.go:61] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.045473   23619 system_pods.go:61] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.045478   23619 system_pods.go:61] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1128 11:46:09.045482   23619 system_pods.go:74] duration metric: took 187.540087ms to wait for pod list to return data ...
	I1128 11:46:09.045488   23619 default_sa.go:34] waiting for default service account to be created ...
	I1128 11:46:09.243309   23619 default_sa.go:45] found service account: "default"
	I1128 11:46:09.243320   23619 default_sa.go:55] duration metric: took 197.823748ms for default service account to be created ...
	I1128 11:46:09.243326   23619 system_pods.go:116] waiting for k8s-apps to be running ...
	I1128 11:46:09.445692   23619 system_pods.go:86] 7 kube-system pods found
	I1128 11:46:09.445706   23619 system_pods.go:89] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.445711   23619 system_pods.go:89] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.445714   23619 system_pods.go:89] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.445718   23619 system_pods.go:89] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.445721   23619 system_pods.go:89] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.445725   23619 system_pods.go:89] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.445729   23619 system_pods.go:89] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Running
	I1128 11:46:09.445733   23619 system_pods.go:126] duration metric: took 202.39991ms to wait for k8s-apps to be running ...
	I1128 11:46:09.445739   23619 system_svc.go:44] waiting for kubelet service to be running ....
	I1128 11:46:09.445802   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1128 11:46:09.455773   23619 system_svc.go:56] duration metric: took 10.02855ms WaitForService to wait for kubelet.
	I1128 11:46:09.455787   23619 kubeadm.go:573] duration metric: took 2.882394245s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I1128 11:46:09.455798   23619 node_conditions.go:102] verifying NodePressure condition ...
	I1128 11:46:09.649312   23619 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I1128 11:46:09.649327   23619 node_conditions.go:123] node cpu capacity is 2
	I1128 11:46:09.649353   23619 node_conditions.go:105] duration metric: took 193.543357ms to run NodePressure ...
	I1128 11:46:09.649377   23619 start.go:217] waiting for startup goroutines ...
	I1128 11:46:09.649843   23619 ssh_runner.go:195] Run: rm -f paused
	I1128 11:46:09.693369   23619 start.go:535] kubectl: 1.25.2, cluster: 1.25.3 (minor skew: 0)
	I1128 11:46:09.719383   23619 out.go:177] * Done! kubectl is now configured to use "pause-114428" cluster and "default" namespace by default
	
	* 
	* ==> Docker <==
	* -- Journal begins at Mon 2022-11-28 19:44:41 UTC, ends at Mon 2022-11-28 19:46:10 UTC. --
	Nov 28 19:45:47 pause-114428 dockerd[3927]: time="2022-11-28T19:45:47.939073752Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9ead6b6b65f5804f29c1c11bac735e78dba4e73dd0ea7bc34808289436343d3e pid=5532 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098426358Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098563598Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098574562Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.099290725Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/a1f9e3b2cd7454357bddb17295f4a1e35da4a58f1edacbdeb921181688bff5bf pid=5709 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190361701Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190396367Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190409497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190614418Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/20905b729742236666d48a2d1afd75a7a28ef485f06f7e052006225c0ca8226e pid=5751 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470362281Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470523701Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470550421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470852155Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/849f8cddc58d7686ef772e9e1c8845a70fba101ad08600f47df0f8d455845008 pid=5861 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923869140Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923931529Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923940891Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.924141542Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e3ce284bd8adaaa9151908279892cbec5e5460b138a1fa7e6ec90e445f479e8c pid=5928 runtime=io.containerd.runc.v2
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889587265Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889645913Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889655186Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.890181143Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/abce6d5dca595edd52e4a8355a41885daa2203a0494b505eef197f4b59c6878b pid=6156 runtime=io.containerd.runc.v2
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169171124Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169229784Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169239029Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169542788Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/ea817622d730befb6352b0a617af62b189382f856291405e1e4cae75244af470 pid=6204 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID
	ea817622d730b       6e38f40d628db       2 seconds ago       Running             storage-provisioner       0                   abce6d5dca595
	e3ce284bd8ada       5185b96f0becf       18 seconds ago      Running             coredns                   2                   849f8cddc58d7
	20905b7297422       beaaf00edd38a       18 seconds ago      Running             kube-proxy                2                   a1f9e3b2cd745
	9ead6b6b65f58       0346dbd74bcb9       23 seconds ago      Running             kube-apiserver            3                   73d62198b6148
	2f771b319f03a       6039992312758       23 seconds ago      Running             kube-controller-manager   2                   cb2097f545b08
	f24fa2c61249f       6d23ec0e8b87e       23 seconds ago      Running             kube-scheduler            2                   7f1a6817926f8
	8ca995856c18a       a8a176a5d5d69       23 seconds ago      Running             etcd                      2                   be8a95ce71d21
	a9d36028607be       5185b96f0becf       25 seconds ago      Created             coredns                   1                   8e79d957b3d6d
	cb7ec3000bb98       0346dbd74bcb9       26 seconds ago      Created             kube-apiserver            2                   eca5b163aceeb
	7cf378f4872e2       6d23ec0e8b87e       34 seconds ago      Exited              kube-scheduler            1                   c2404a1daca8e
	cf3c8b0b6388f       a8a176a5d5d69       35 seconds ago      Exited              etcd                      1                   aaa7b0e997b98
	677c0d9be193a       6039992312758       36 seconds ago      Exited              kube-controller-manager   1                   71edc30d3dbb5
	d3c7e3e4259d7       beaaf00edd38a       36 seconds ago      Exited              kube-proxy                1                   81e0e82e398af
	
	* 
	* ==> coredns [a9d36028607b] <==
	* 
	* 
	* ==> coredns [e3ce284bd8ad] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> describe nodes <==
	* Name:               pause-114428
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-114428
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ca606bc88c49b8633ca8bf16fc174bca0c3a74e
	                    minikube.k8s.io/name=pause-114428
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_11_28T11_45_12_0700
	                    minikube.k8s.io/version=v1.28.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 28 Nov 2022 19:45:11 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-114428
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 28 Nov 2022 19:46:01 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:12 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.71
	  Hostname:    pause-114428
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 ec7b50837b2241e5af48c3dad42d792d
	  System UUID:                151111ed-0000-0000-94c8-f01898ef957c
	  Boot ID:                    a8c97c69-e70e-4c1a-82a4-f868cdd4b19a
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.21
	  Kubelet Version:            v1.25.3
	  Kube-Proxy Version:         v1.25.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                    CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                    ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-hstbw                100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     47s
	  kube-system                 etcd-pause-114428                       100m (5%)     0 (0%)      100Mi (5%)       0 (0%)         59s
	  kube-system                 kube-apiserver-pause-114428             250m (12%)    0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 kube-controller-manager-pause-114428    200m (10%)    0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 kube-proxy-gm2kh                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         47s
	  kube-system                 kube-scheduler-pause-114428             100m (5%)     0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 storage-provisioner                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 45s                kube-proxy       
	  Normal  Starting                 19s                kube-proxy       
	  Normal  NodeHasSufficientPID     59s                kubelet          Node pause-114428 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  59s                kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  59s                kubelet          Node pause-114428 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    59s                kubelet          Node pause-114428 status is now: NodeHasNoDiskPressure
	  Normal  NodeReady                59s                kubelet          Node pause-114428 status is now: NodeReady
	  Normal  Starting                 59s                kubelet          Starting kubelet.
	  Normal  RegisteredNode           47s                node-controller  Node pause-114428 event: Registered Node pause-114428 in Controller
	  Normal  Starting                 25s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  25s (x8 over 25s)  kubelet          Node pause-114428 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    25s (x8 over 25s)  kubelet          Node pause-114428 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     25s (x7 over 25s)  kubelet          Node pause-114428 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  25s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           8s                 node-controller  Node pause-114428 event: Registered Node pause-114428 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000002] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.872776] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000004] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +3.502021] systemd-fstab-generator[550]: Ignoring "noauto" for root device
	[  +0.080598] systemd-fstab-generator[561]: Ignoring "noauto" for root device
	[  +4.621130] systemd-fstab-generator[784]: Ignoring "noauto" for root device
	[  +1.389487] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.211553] systemd-fstab-generator[946]: Ignoring "noauto" for root device
	[  +0.083763] systemd-fstab-generator[957]: Ignoring "noauto" for root device
	[  +0.095050] systemd-fstab-generator[968]: Ignoring "noauto" for root device
	[  +1.348467] systemd-fstab-generator[1118]: Ignoring "noauto" for root device
	[  +0.091839] systemd-fstab-generator[1129]: Ignoring "noauto" for root device
	[  +3.080266] systemd-fstab-generator[1343]: Ignoring "noauto" for root device
	[  +0.541704] kauditd_printk_skb: 68 callbacks suppressed
	[Nov28 19:45] systemd-fstab-generator[1999]: Ignoring "noauto" for root device
	[ +13.258785] kauditd_printk_skb: 8 callbacks suppressed
	[  +6.823620] kauditd_printk_skb: 20 callbacks suppressed
	[  +0.847816] systemd-fstab-generator[2971]: Ignoring "noauto" for root device
	[  +0.183187] systemd-fstab-generator[3040]: Ignoring "noauto" for root device
	[  +0.140719] systemd-fstab-generator[3051]: Ignoring "noauto" for root device
	[  +7.553990] systemd-fstab-generator[4373]: Ignoring "noauto" for root device
	[  +0.096660] systemd-fstab-generator[4384]: Ignoring "noauto" for root device
	[  +3.112545] kauditd_printk_skb: 31 callbacks suppressed
	[  +2.227867] systemd-fstab-generator[5154]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [8ca995856c18] <==
	* {"level":"info","ts":"2022-11-28T19:45:47.772Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"9d8eb8badbf65f53","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 switched to configuration voters=(11353214823341383507)"}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"148f46ea0e83aed3","local-member-id":"9d8eb8badbf65f53","added-peer-id":"9d8eb8badbf65f53","added-peer-peer-urls":["https://192.168.64.71:2380"]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"148f46ea0e83aed3","local-member-id":"9d8eb8badbf65f53","cluster-version":"3.5"}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"9d8eb8badbf65f53","initial-advertise-peer-urls":["https://192.168.64.71:2380"],"listen-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.71:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-11-28T19:45:47.775Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:47.775Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 is starting a new election at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became pre-candidate at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgPreVoteResp from 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became candidate at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgVoteResp from 9d8eb8badbf65f53 at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became leader at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: 9d8eb8badbf65f53 elected leader 9d8eb8badbf65f53 at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.570Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"9d8eb8badbf65f53","local-member-attributes":"{Name:pause-114428 ClientURLs:[https://192.168.64.71:2379]}","request-path":"/0/members/9d8eb8badbf65f53/attributes","cluster-id":"148f46ea0e83aed3","publish-timeout":"7s"}
	{"level":"info","ts":"2022-11-28T19:45:49.570Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:49.571Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-11-28T19:45:49.572Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:49.572Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.71:2379"}
	{"level":"info","ts":"2022-11-28T19:45:49.583Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:49.583Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> etcd [cf3c8b0b6388] <==
	* {"level":"info","ts":"2022-11-28T19:45:35.804Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:35.804Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 is starting a new election at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became pre-candidate at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgPreVoteResp from 9d8eb8badbf65f53 at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became candidate at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgVoteResp from 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became leader at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: 9d8eb8badbf65f53 elected leader 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.795Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"9d8eb8badbf65f53","local-member-attributes":"{Name:pause-114428 ClientURLs:[https://192.168.64.71:2379]}","request-path":"/0/members/9d8eb8badbf65f53/attributes","cluster-id":"148f46ea0e83aed3","publish-timeout":"7s"}
	{"level":"info","ts":"2022-11-28T19:45:36.795Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.71:2379"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:36.798Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	WARNING: 2022/11/28 19:45:40 [core] grpc: Server.processUnaryRPC failed to write status connection error: desc = "transport is closing"
	{"level":"info","ts":"2022-11-28T19:45:40.086Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-11-28T19:45:40.086Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-114428","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"]}
	WARNING: 2022/11/28 19:45:40 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/11/28 19:45:40 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.71:2379 192.168.64.71:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.71:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-11-28T19:45:40.140Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"9d8eb8badbf65f53","current-leader-member-id":"9d8eb8badbf65f53"}
	{"level":"info","ts":"2022-11-28T19:45:40.141Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:40.143Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:40.143Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-114428","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"]}
	
	* 
	* ==> kernel <==
	*  19:46:12 up 1 min,  0 users,  load average: 0.84, 0.32, 0.12
	Linux pause-114428 5.10.57 #1 SMP Thu Nov 17 20:18:45 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [9ead6b6b65f5] <==
	* I1128 19:45:51.391536       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1128 19:45:51.419748       1 controller.go:85] Starting OpenAPI controller
	I1128 19:45:51.420165       1 controller.go:85] Starting OpenAPI V3 controller
	I1128 19:45:51.420240       1 naming_controller.go:291] Starting NamingConditionController
	I1128 19:45:51.420375       1 establishing_controller.go:76] Starting EstablishingController
	I1128 19:45:51.420487       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I1128 19:45:51.420544       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I1128 19:45:51.420636       1 crd_finalizer.go:266] Starting CRDFinalizer
	I1128 19:45:51.495238       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I1128 19:45:51.497265       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I1128 19:45:51.501641       1 cache.go:39] Caches are synced for autoregister controller
	I1128 19:45:51.501852       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I1128 19:45:51.502356       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I1128 19:45:51.515859       1 shared_informer.go:262] Caches are synced for node_authorizer
	I1128 19:45:51.529117       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I1128 19:45:51.558519       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I1128 19:45:52.202910       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I1128 19:45:52.402628       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1128 19:45:53.049729       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I1128 19:45:53.057998       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I1128 19:45:53.080027       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I1128 19:45:53.095389       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1128 19:45:53.099831       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1128 19:46:03.782524       1 controller.go:616] quota admission added evaluator for: endpoints
	I1128 19:46:03.785336       1 controller.go:616] quota admission added evaluator for: endpointslices.discovery.k8s.io
	
	* 
	* ==> kube-apiserver [cb7ec3000bb9] <==
	* 
	* 
	* ==> kube-controller-manager [2f771b319f03] <==
	* I1128 19:46:03.803704       1 shared_informer.go:262] Caches are synced for namespace
	I1128 19:46:03.803906       1 shared_informer.go:262] Caches are synced for attach detach
	I1128 19:46:03.805390       1 shared_informer.go:262] Caches are synced for PVC protection
	I1128 19:46:03.805588       1 shared_informer.go:262] Caches are synced for TTL
	I1128 19:46:03.807494       1 shared_informer.go:262] Caches are synced for TTL after finished
	I1128 19:46:03.810908       1 shared_informer.go:262] Caches are synced for PV protection
	I1128 19:46:03.811087       1 shared_informer.go:262] Caches are synced for disruption
	I1128 19:46:03.813835       1 shared_informer.go:262] Caches are synced for taint
	I1128 19:46:03.814002       1 taint_manager.go:204] "Starting NoExecuteTaintManager"
	I1128 19:46:03.814147       1 taint_manager.go:209] "Sending events to api server"
	I1128 19:46:03.814924       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	W1128 19:46:03.815014       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-114428. Assuming now as a timestamp.
	I1128 19:46:03.815095       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I1128 19:46:03.815185       1 event.go:294] "Event occurred" object="pause-114428" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-114428 event: Registered Node pause-114428 in Controller"
	I1128 19:46:03.824085       1 shared_informer.go:262] Caches are synced for stateful set
	I1128 19:46:03.826573       1 shared_informer.go:262] Caches are synced for ephemeral
	I1128 19:46:03.827910       1 shared_informer.go:262] Caches are synced for crt configmap
	I1128 19:46:03.832526       1 shared_informer.go:262] Caches are synced for GC
	I1128 19:46:03.892263       1 shared_informer.go:262] Caches are synced for resource quota
	I1128 19:46:03.915317       1 shared_informer.go:262] Caches are synced for persistent volume
	I1128 19:46:03.915496       1 shared_informer.go:262] Caches are synced for resource quota
	I1128 19:46:03.921779       1 shared_informer.go:262] Caches are synced for HPA
	I1128 19:46:04.348738       1 shared_informer.go:262] Caches are synced for garbage collector
	I1128 19:46:04.378484       1 shared_informer.go:262] Caches are synced for garbage collector
	I1128 19:46:04.378549       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-controller-manager [677c0d9be193] <==
	* I1128 19:45:35.295899       1 serving.go:348] Generated self-signed cert in-memory
	I1128 19:45:35.739194       1 controllermanager.go:178] Version: v1.25.3
	I1128 19:45:35.739272       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:35.740120       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I1128 19:45:35.740242       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I1128 19:45:35.740397       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1128 19:45:35.740906       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1128 19:45:37.085997       1 shared_informer.go:255] Waiting for caches to sync for tokens
	F1128 19:45:37.110344       1 client_builder_dynamic.go:138] Get "https://192.168.64.71:8443/api/v1/namespaces/kube-system/serviceaccounts/disruption-controller": dial tcp 192.168.64.71:8443: connect: connection refused
	
	* 
	* ==> kube-proxy [20905b729742] <==
	* I1128 19:45:52.278223       1 node.go:163] Successfully retrieved node IP: 192.168.64.71
	I1128 19:45:52.278294       1 server_others.go:138] "Detected node IP" address="192.168.64.71"
	I1128 19:45:52.278309       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I1128 19:45:52.300428       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I1128 19:45:52.300467       1 server_others.go:206] "Using iptables Proxier"
	I1128 19:45:52.300510       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I1128 19:45:52.300719       1 server.go:661] "Version info" version="v1.25.3"
	I1128 19:45:52.300746       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:52.301173       1 config.go:317] "Starting service config controller"
	I1128 19:45:52.301203       1 shared_informer.go:255] Waiting for caches to sync for service config
	I1128 19:45:52.301218       1 config.go:226] "Starting endpoint slice config controller"
	I1128 19:45:52.301222       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I1128 19:45:52.301572       1 config.go:444] "Starting node config controller"
	I1128 19:45:52.301599       1 shared_informer.go:255] Waiting for caches to sync for node config
	I1128 19:45:52.401733       1 shared_informer.go:262] Caches are synced for node config
	I1128 19:45:52.401817       1 shared_informer.go:262] Caches are synced for service config
	I1128 19:45:52.401834       1 shared_informer.go:262] Caches are synced for endpoint slice config
	
	* 
	* ==> kube-proxy [d3c7e3e4259d] <==
	* I1128 19:45:37.083278       1 node.go:163] Successfully retrieved node IP: 192.168.64.71
	I1128 19:45:37.083358       1 server_others.go:138] "Detected node IP" address="192.168.64.71"
	I1128 19:45:37.083374       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I1128 19:45:37.244908       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I1128 19:45:37.245016       1 server_others.go:206] "Using iptables Proxier"
	I1128 19:45:37.245128       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I1128 19:45:37.245414       1 server.go:661] "Version info" version="v1.25.3"
	I1128 19:45:37.245622       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:37.246202       1 config.go:317] "Starting service config controller"
	I1128 19:45:37.246265       1 shared_informer.go:255] Waiting for caches to sync for service config
	I1128 19:45:37.246308       1 config.go:226] "Starting endpoint slice config controller"
	I1128 19:45:37.246385       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I1128 19:45:37.246970       1 config.go:444] "Starting node config controller"
	I1128 19:45:37.247017       1 shared_informer.go:255] Waiting for caches to sync for node config
	E1128 19:45:37.247437       1 event_broadcaster.go:262] Unable to write event: 'Post "https://control-plane.minikube.internal:8443/apis/events.k8s.io/v1/namespaces/default/events": dial tcp 192.168.64.71:8443: connect: connection refused' (may retry after sleeping)
	W1128 19:45:37.247538       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://control-plane.minikube.internal:8443/api/v1/nodes?fieldSelector=metadata.name%3Dpause-114428&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.247603       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://control-plane.minikube.internal:8443/api/v1/nodes?fieldSelector=metadata.name%3Dpause-114428&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.247675       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.EndpointSlice: Get "https://control-plane.minikube.internal:8443/apis/discovery.k8s.io/v1/endpointslices?labelSelector=%21service.kubernetes.io%2Fheadless%2C%21service.kubernetes.io%2Fservice-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.247836       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://control-plane.minikube.internal:8443/apis/discovery.k8s.io/v1/endpointslices?labelSelector=%21service.kubernetes.io%2Fheadless%2C%21service.kubernetes.io%2Fservice-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.247942       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: Get "https://control-plane.minikube.internal:8443/api/v1/services?labelSelector=%21service.kubernetes.io%2Fheadless%2C%21service.kubernetes.io%2Fservice-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.248046       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://control-plane.minikube.internal:8443/api/v1/services?labelSelector=%21service.kubernetes.io%2Fheadless%2C%21service.kubernetes.io%2Fservice-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [7cf378f4872e] <==
	* W1128 19:45:37.579193       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.71:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579209       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.71:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579255       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.71:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579273       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.71:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579313       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579369       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579385       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579431       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: Get "https://192.168.64.71:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579446       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.168.64.71:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579491       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579509       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579561       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.71:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579577       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.71:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579626       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.71:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579642       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.71:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579689       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.71:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579706       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.71:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579782       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.71:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579821       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.71:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	I1128 19:45:40.192257       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I1128 19:45:40.192312       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E1128 19:45:40.192372       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I1128 19:45:40.192382       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E1128 19:45:40.192653       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kube-scheduler [f24fa2c61249] <==
	* I1128 19:45:48.943389       1 serving.go:348] Generated self-signed cert in-memory
	W1128 19:45:51.431124       1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W1128 19:45:51.431162       1 authentication.go:346] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W1128 19:45:51.431173       1 authentication.go:347] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1128 19:45:51.431178       1 authentication.go:348] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1128 19:45:51.489608       1 server.go:148] "Starting Kubernetes Scheduler" version="v1.25.3"
	I1128 19:45:51.489679       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:51.490452       1 secure_serving.go:210] Serving securely on 127.0.0.1:10259
	I1128 19:45:51.490664       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I1128 19:45:51.490685       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1128 19:45:51.492612       1 shared_informer.go:255] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I1128 19:45:51.592870       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Mon 2022-11-28 19:44:41 UTC, ends at Mon 2022-11-28 19:46:13 UTC. --
	Nov 28 19:45:50 pause-114428 kubelet[5160]: E1128 19:45:50.888347    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:50 pause-114428 kubelet[5160]: E1128 19:45:50.989327    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.090157    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.191131    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.291436    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.392560    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.452540    5160 apiserver.go:52] "Watching apiserver"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.457340    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.457522    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.493456    5160 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.494470    5160 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.572696    5160 kubelet_node_status.go:108] "Node was previously registered" node="pause-114428"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.572896    5160 kubelet_node_status.go:73] "Successfully registered node" node="pause-114428"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587359    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/5f48a047-d53f-4630-beec-4846d0327f1f-kube-proxy\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587403    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5f48a047-d53f-4630-beec-4846d0327f1f-xtables-lock\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587423    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m896r\" (UniqueName: \"kubernetes.io/projected/5f48a047-d53f-4630-beec-4846d0327f1f-kube-api-access-m896r\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587438    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f48a047-d53f-4630-beec-4846d0327f1f-lib-modules\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587453    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21d18ebd-7324-4815-857d-aa2cea270e10-config-volume\") pod \"coredns-565d847f94-hstbw\" (UID: \"21d18ebd-7324-4815-857d-aa2cea270e10\") " pod="kube-system/coredns-565d847f94-hstbw"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587467    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mc7k\" (UniqueName: \"kubernetes.io/projected/21d18ebd-7324-4815-857d-aa2cea270e10-kube-api-access-9mc7k\") pod \"coredns-565d847f94-hstbw\" (UID: \"21d18ebd-7324-4815-857d-aa2cea270e10\") " pod="kube-system/coredns-565d847f94-hstbw"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587475    5160 reconciler.go:169] "Reconciler: start to sync state"
	Nov 28 19:45:55 pause-114428 kubelet[5160]: I1128 19:45:55.048441    5160 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.448454    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.533903    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xg5f\" (UniqueName: \"kubernetes.io/projected/f3bb3ce4-fe67-4d5b-9a95-048c18b13469-kube-api-access-7xg5f\") pod \"storage-provisioner\" (UID: \"f3bb3ce4-fe67-4d5b-9a95-048c18b13469\") " pod="kube-system/storage-provisioner"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.534114    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/f3bb3ce4-fe67-4d5b-9a95-048c18b13469-tmp\") pod \"storage-provisioner\" (UID: \"f3bb3ce4-fe67-4d5b-9a95-048c18b13469\") " pod="kube-system/storage-provisioner"
	Nov 28 19:46:08 pause-114428 kubelet[5160]: I1128 19:46:08.130797    5160 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="abce6d5dca595edd52e4a8355a41885daa2203a0494b505eef197f4b59c6878b"
	
	* 
	* ==> storage-provisioner [ea817622d730] <==
	* I1128 19:46:08.218499       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I1128 19:46:08.227072       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I1128 19:46:08.227155       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I1128 19:46:08.231884       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I1128 19:46:08.232532       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7!
	I1128 19:46:08.235299       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"f4dc94b1-53db-49f8-ac60-7ccaf0a5fc0a", APIVersion:"v1", ResourceVersion:"473", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7 became leader
	I1128 19:46:08.333291       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-114428 -n pause-114428
helpers_test.go:261: (dbg) Run:  kubectl --context pause-114428 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-114428 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-114428 describe pod : exit status 1 (41.548459ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context pause-114428 describe pod : exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-114428 -n pause-114428
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-114428 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-114428 logs -n 25: (2.943097938s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |          Profile          |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:39 PST | 28 Nov 22 11:40 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.3   |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:40 PST |                     |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0   |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:40 PST | 28 Nov 22 11:41 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.3   |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p kubernetes-upgrade-113819   | kubernetes-upgrade-113819 | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	| start   | -p cert-expiration-113723      | cert-expiration-113723    | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	|         | --memory=2048                  |                           |         |         |                     |                     |
	|         | --cert-expiration=8760h        |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p cert-expiration-113723      | cert-expiration-113723    | jenkins | v1.28.0 | 28 Nov 22 11:41 PST | 28 Nov 22 11:41 PST |
	| start   | -p stopped-upgrade-114107      | stopped-upgrade-114107    | jenkins | v1.28.0 | 28 Nov 22 11:43 PST | 28 Nov 22 11:44 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p running-upgrade-114136      | running-upgrade-114136    | jenkins | v1.28.0 | 28 Nov 22 11:43 PST | 28 Nov 22 11:44 PST |
	|         | --memory=2200                  |                           |         |         |                     |                     |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p stopped-upgrade-114107      | stopped-upgrade-114107    | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:44 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:44 PST |                     |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --kubernetes-version=1.20      |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:45 PST |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p running-upgrade-114136      | running-upgrade-114136    | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:44 PST |
	| start   | -p pause-114428 --memory=2048  | pause-114428              | jenkins | v1.28.0 | 28 Nov 22 11:44 PST | 28 Nov 22 11:45 PST |
	|         | --install-addons=false         |                           |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit   |                           |         |         |                     |                     |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| delete  | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --no-kubernetes                |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| start   | -p pause-114428                | pause-114428              | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:46 PST |
	|         | --alsologtostderr -v=1         |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| ssh     | -p NoKubernetes-114420 sudo    | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | systemctl is-active --quiet    |                           |         |         |                     |                     |
	|         | service kubelet                |                           |         |         |                     |                     |
	| profile | list                           | minikube                  | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| profile | list --output=json             | minikube                  | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| stop    | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	| ssh     | -p NoKubernetes-114420 sudo    | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | systemctl is-active --quiet    |                           |         |         |                     |                     |
	|         | service kubelet                |                           |         |         |                     |                     |
	| delete  | -p NoKubernetes-114420         | NoKubernetes-114420       | jenkins | v1.28.0 | 28 Nov 22 11:45 PST | 28 Nov 22 11:45 PST |
	| start   | -p auto-113537 --memory=2048   | auto-113537               | jenkins | v1.28.0 | 28 Nov 22 11:45 PST |                     |
	|         | --alsologtostderr              |                           |         |         |                     |                     |
	|         | --wait=true --wait-timeout=5m  |                           |         |         |                     |                     |
	|         | --driver=hyperkit              |                           |         |         |                     |                     |
	|---------|--------------------------------|---------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/11/28 11:45:56
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19.3 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1128 11:45:56.335795   23695 out.go:296] Setting OutFile to fd 1 ...
	I1128 11:45:56.335982   23695 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:56.335987   23695 out.go:309] Setting ErrFile to fd 2...
	I1128 11:45:56.335991   23695 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:45:56.336102   23695 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 11:45:56.336655   23695 out.go:303] Setting JSON to false
	I1128 11:45:56.355529   23695 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":9931,"bootTime":1669654825,"procs":398,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 11:45:56.355630   23695 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 11:45:56.377508   23695 out.go:177] * [auto-113537] minikube v1.28.0 on Darwin 13.0.1
	I1128 11:45:56.399253   23695 notify.go:220] Checking for updates...
	I1128 11:45:56.399295   23695 out.go:177]   - MINIKUBE_LOCATION=15411
	I1128 11:45:56.421341   23695 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:45:56.443238   23695 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 11:45:56.464325   23695 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 11:45:56.507037   23695 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 11:45:56.544554   23695 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:45:56.544601   23695 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 11:45:56.573415   23695 out.go:177] * Using the hyperkit driver based on user configuration
	I1128 11:45:56.615268   23695 start.go:293] selected driver: hyperkit
	I1128 11:45:56.615328   23695 start.go:837] validating driver "hyperkit" against <nil>
	I1128 11:45:56.615366   23695 start.go:848] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1128 11:45:56.619138   23695 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:56.619304   23695 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/15411-14646/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1128 11:45:56.626281   23695 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.28.0
	I1128 11:45:56.629580   23695 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:56.629598   23695 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1128 11:45:56.629636   23695 start_flags.go:303] no existing cluster config was found, will generate one from the flags 
	I1128 11:45:56.629793   23695 start_flags.go:910] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1128 11:45:56.629820   23695 cni.go:95] Creating CNI manager for ""
	I1128 11:45:56.629829   23695 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 11:45:56.629837   23695 start_flags.go:317] config:
	{Name:auto-113537 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.3 ClusterName:auto-113537 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 11:45:56.629944   23695 iso.go:125] acquiring lock: {Name:mkf8786ebc65c7c4a918cffd312ffffda2a4bd0b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 11:45:56.672004   23695 out.go:177] * Starting control plane node auto-113537 in cluster auto-113537
	I1128 11:45:56.711339   23695 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:45:56.711457   23695 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4
	I1128 11:45:56.711496   23695 cache.go:57] Caching tarball of preloaded images
	I1128 11:45:56.711745   23695 preload.go:174] Found /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I1128 11:45:56.711773   23695 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.3 on docker
	I1128 11:45:56.711928   23695 profile.go:148] Saving config to /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/config.json ...
	I1128 11:45:56.711982   23695 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/config.json: {Name:mk7f26573ec22434b9a6aa9a5cdee227059dc03f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:45:56.712562   23695 cache.go:208] Successfully downloaded all kic artifacts
	I1128 11:45:56.712614   23695 start.go:364] acquiring machines lock for auto-113537: {Name:mk027eaad0dbb84f6e95336dab244cf2d7aaac44 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I1128 11:45:56.712714   23695 start.go:368] acquired machines lock for "auto-113537" in 85.681µs
	I1128 11:45:56.712758   23695 start.go:93] Provisioning new machine with config: &{Name:auto-113537 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.3 ClusterName:auto-113537 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet} &{Name: IP: Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1128 11:45:56.712867   23695 start.go:125] createHost starting for "" (driver="hyperkit")
	I1128 11:45:55.030954   23619 pod_ready.go:102] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:55.531849   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:55.531861   23619 pod_ready.go:81] duration metric: took 2.508666905s waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:55.531868   23619 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:57.538943   23619 pod_ready.go:102] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:45:58.038339   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:45:58.038353   23619 pod_ready.go:81] duration metric: took 2.506427283s waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:58.038359   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:45:56.750239   23695 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I1128 11:45:56.750744   23695 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:45:56.750817   23695 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:45:56.758738   23695 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58736
	I1128 11:45:56.759133   23695 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:45:56.759558   23695 main.go:134] libmachine: Using API Version  1
	I1128 11:45:56.759569   23695 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:45:56.759778   23695 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:45:56.759865   23695 main.go:134] libmachine: (auto-113537) Calling .GetMachineName
	I1128 11:45:56.759947   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:45:56.760046   23695 start.go:159] libmachine.API.Create for "auto-113537" (driver="hyperkit")
	I1128 11:45:56.760071   23695 client.go:168] LocalClient.Create starting
	I1128 11:45:56.760114   23695 main.go:134] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem
	I1128 11:45:56.760161   23695 main.go:134] libmachine: Decoding PEM data...
	I1128 11:45:56.760177   23695 main.go:134] libmachine: Parsing certificate...
	I1128 11:45:56.760239   23695 main.go:134] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem
	I1128 11:45:56.760271   23695 main.go:134] libmachine: Decoding PEM data...
	I1128 11:45:56.760283   23695 main.go:134] libmachine: Parsing certificate...
	I1128 11:45:56.760299   23695 main.go:134] libmachine: Running pre-create checks...
	I1128 11:45:56.760306   23695 main.go:134] libmachine: (auto-113537) Calling .PreCreateCheck
	I1128 11:45:56.760378   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:56.760573   23695 main.go:134] libmachine: (auto-113537) Calling .GetConfigRaw
	I1128 11:45:56.761021   23695 main.go:134] libmachine: Creating machine...
	I1128 11:45:56.761030   23695 main.go:134] libmachine: (auto-113537) Calling .Create
	I1128 11:45:56.761106   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:56.761231   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:56.761096   23703 common.go:116] Making disk image using store path: /Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 11:45:56.761285   23695 main.go:134] libmachine: (auto-113537) Downloading /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/15411-14646/.minikube/cache/iso/amd64/minikube-v1.28.0-1668700269-15235-amd64.iso...
	I1128 11:45:56.911471   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:56.911404   23703 common.go:123] Creating ssh key: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa...
	I1128 11:45:57.006986   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:57.006900   23703 common.go:129] Creating raw disk image: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk...
	I1128 11:45:57.007008   23695 main.go:134] libmachine: (auto-113537) DBG | Writing magic tar header
	I1128 11:45:57.007017   23695 main.go:134] libmachine: (auto-113537) DBG | Writing SSH key tar header
	I1128 11:45:57.007972   23695 main.go:134] libmachine: (auto-113537) DBG | I1128 11:45:57.007927   23703 common.go:143] Fixing permissions on /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537 ...
	I1128 11:45:57.155343   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:57.155360   23695 main.go:134] libmachine: (auto-113537) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid
	I1128 11:45:57.155400   23695 main.go:134] libmachine: (auto-113537) DBG | Using UUID 46b4ad8a-6f55-11ed-a6ea-f01898ef957c
	I1128 11:45:57.183655   23695 main.go:134] libmachine: (auto-113537) DBG | Generated MAC b6:f5:64:15:38:f8
	I1128 11:45:57.183679   23695 main.go:134] libmachine: (auto-113537) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537
	I1128 11:45:57.183717   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"46b4ad8a-6f55-11ed-a6ea-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000206ed0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage", Initrd:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1128 11:45:57.183757   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"46b4ad8a-6f55-11ed-a6ea-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000206ed0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage", Initrd:"/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I1128 11:45:57.183813   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "46b4ad8a-6f55-11ed-a6ea-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/tty,log=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537"}
	I1128 11:45:57.183862   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 46b4ad8a-6f55-11ed-a6ea-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/auto-113537.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/tty,log=/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/console-ring -f kexec,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/bzimage,/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=auto-113537"
	I1128 11:45:57.183878   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I1128 11:45:57.185452   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 DEBUG: hyperkit: Pid is 23706
	I1128 11:45:57.185821   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 0
	I1128 11:45:57.185840   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:57.185931   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:45:57.187669   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:45:57.187786   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:45:57.187797   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:45:57.187821   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:45:57.187828   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:45:57.187835   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:45:57.187845   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:45:57.187855   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:45:57.187866   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:45:57.187874   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:45:57.187881   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:45:57.187901   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:45:57.187919   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:45:57.187934   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:45:57.187950   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:45:57.187961   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:45:57.187970   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:45:57.187980   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:45:57.187991   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:45:57.188002   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:45:57.188012   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:45:57.188020   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:45:57.188028   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:45:57.188037   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:45:57.188056   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:45:57.188071   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:45:57.188080   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:45:57.188090   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:45:57.188098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:45:57.188105   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:45:57.188120   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:45:57.188134   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:45:57.188154   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:45:57.188173   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:45:57.188189   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:45:57.188213   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:45:57.188230   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:45:57.188243   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:45:57.188252   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:45:57.188260   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:45:57.188272   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:45:57.188283   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:45:57.188295   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:45:57.188306   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:45:57.188315   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:45:57.188324   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:45:57.188332   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:45:57.188352   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:45:57.188365   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:45:57.188375   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:45:57.188383   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:45:57.188401   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:45:57.188422   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:45:57.188436   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:45:57.188445   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:45:57.188453   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:45:57.188464   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:45:57.188473   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:45:57.188489   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:45:57.188501   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:45:57.188508   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:45:57.188517   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:45:57.188524   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:45:57.188533   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:45:57.188540   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:45:57.188548   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:45:57.188556   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:45:57.188564   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:45:57.188572   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:45:57.188580   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:45:57.188601   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:45:57.188610   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:45:57.188618   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:45:57.191973   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I1128 11:45:57.198964   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I1128 11:45:57.199627   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1128 11:45:57.199645   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1128 11:45:57.199675   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1128 11:45:57.199689   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1128 11:45:57.547255   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I1128 11:45:57.547271   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I1128 11:45:57.651305   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I1128 11:45:57.651326   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I1128 11:45:57.651345   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I1128 11:45:57.651361   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I1128 11:45:57.652178   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I1128 11:45:57.652186   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:45:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I1128 11:45:59.189654   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 1
	I1128 11:45:59.189670   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:45:59.189730   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:45:59.190510   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:45:59.190633   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:45:59.190642   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:45:59.190652   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:45:59.190658   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:45:59.190665   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:45:59.190675   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:45:59.190682   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:45:59.190690   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:45:59.190708   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:45:59.190719   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:45:59.190741   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:45:59.190752   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:45:59.190759   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:45:59.190765   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:45:59.190775   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:45:59.190782   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:45:59.190794   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:45:59.190801   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:45:59.190811   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:45:59.190818   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:45:59.190826   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:45:59.190839   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:45:59.190859   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:45:59.190872   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:45:59.190880   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:45:59.190888   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:45:59.190898   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:45:59.190906   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:45:59.190914   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:45:59.190920   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:45:59.190927   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:45:59.190937   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:45:59.190944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:45:59.190953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:45:59.190960   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:45:59.190969   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:45:59.190976   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:45:59.190984   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:45:59.190992   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:45:59.191003   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:45:59.191012   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:45:59.191021   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:45:59.191029   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:45:59.191036   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:45:59.191051   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:45:59.191066   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:45:59.191075   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:45:59.191083   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:45:59.191091   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:45:59.191098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:45:59.191107   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:45:59.191115   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:45:59.191126   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:45:59.191133   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:45:59.191140   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:45:59.191148   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:45:59.191156   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:45:59.191163   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:45:59.191173   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:45:59.191183   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:45:59.191195   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:45:59.191203   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:45:59.191212   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:45:59.191219   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:45:59.191228   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:45:59.191253   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:45:59.191284   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:45:59.191292   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:45:59.191305   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:45:59.191314   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:45:59.191325   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:45:59.191335   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:01.191376   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 2
	I1128 11:46:01.191406   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:01.191464   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:01.192277   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:01.192350   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:46:01.192361   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:46:01.192373   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:46:01.192387   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:46:01.192396   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:46:01.192405   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:46:01.192417   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:46:01.192424   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:46:01.192431   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:46:01.192438   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:46:01.192449   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:46:01.192460   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:46:01.192467   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:46:01.192474   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:46:01.192481   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:46:01.192491   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:46:01.192501   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:46:01.192510   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:46:01.192517   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:46:01.192524   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:46:01.192554   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:46:01.192568   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:46:01.192577   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:46:01.192584   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:46:01.192593   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:46:01.192600   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:46:01.192608   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:46:01.192619   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:46:01.192626   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:46:01.192635   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:46:01.192645   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:46:01.192655   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:46:01.192663   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:46:01.192670   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:46:01.192686   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:46:01.192698   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:46:01.192707   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:46:01.192713   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:46:01.192726   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:46:01.192741   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:46:01.192764   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:46:01.192778   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:46:01.192789   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:46:01.192797   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:46:01.192812   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:46:01.192821   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:46:01.192832   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:46:01.192841   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:46:01.192848   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:46:01.192857   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:46:01.192864   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:46:01.192870   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:46:01.192877   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:46:01.192884   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:46:01.192892   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:46:01.192899   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:46:01.192913   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:46:01.192921   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:46:01.192929   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:46:01.192936   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:46:01.192944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:46:01.192953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:46:01.192965   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:46:01.192975   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:46:01.192991   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:46:01.193006   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:46:01.193015   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:46:01.193029   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:46:01.193040   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:46:01.193050   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:46:01.193071   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:46:01.193089   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:00.045962   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:02.545073   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:02.074552   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I1128 11:46:02.074660   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I1128 11:46:02.074670   23695 main.go:134] libmachine: (auto-113537) DBG | 2022/11/28 11:46:02 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I1128 11:46:03.194401   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 3
	I1128 11:46:03.194417   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:03.194477   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:03.195494   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:03.195586   23695 main.go:134] libmachine: (auto-113537) DBG | Found 71 entries in /var/db/dhcpd_leases!
	I1128 11:46:03.195595   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.72 HWAddress:52:8e:82:73:2d:5d ID:1,52:8e:82:73:2d:5d Lease:0x63850ff3}
	I1128 11:46:03.195605   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.71 HWAddress:82:25:bb:13:22:fb ID:1,82:25:bb:13:22:fb Lease:0x63866129}
	I1128 11:46:03.195619   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.70 HWAddress:fa:3e:bb:ec:c:fa ID:1,fa:3e:bb:ec:c:fa Lease:0x63850fce}
	I1128 11:46:03.195628   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.69 HWAddress:1a:4b:c7:db:c5:27 ID:1,1a:4b:c7:db:c5:27 Lease:0x63866083}
	I1128 11:46:03.195635   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.68 HWAddress:6a:d5:2a:9d:e3:ec ID:1,6a:d5:2a:9d:e3:ec Lease:0x638660e0}
	I1128 11:46:03.195646   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.67 HWAddress:d2:b6:13:e9:ec:fb ID:1,d2:b6:13:e9:ec:fb Lease:0x63850ed1}
	I1128 11:46:03.195656   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.66 HWAddress:4e:65:48:b0:42:f7 ID:1,4e:65:48:b0:42:f7 Lease:0x63865f87}
	I1128 11:46:03.195662   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.65 HWAddress:a:c0:d2:86:d1:de ID:1,a:c0:d2:86:d1:de Lease:0x63865f7c}
	I1128 11:46:03.195670   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.64 HWAddress:a2:40:8b:72:14:1e ID:1,a2:40:8b:72:14:1e Lease:0x63850dfd}
	I1128 11:46:03.195683   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.63 HWAddress:e6:49:9d:db:e8:47 ID:1,e6:49:9d:db:e8:47 Lease:0x63850df2}
	I1128 11:46:03.195693   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.62 HWAddress:32:8b:8b:9c:c:60 ID:1,32:8b:8b:9c:c:60 Lease:0x63850dc6}
	I1128 11:46:03.195702   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.61 HWAddress:36:fb:4d:2c:8:50 ID:1,36:fb:4d:2c:8:50 Lease:0x63865f12}
	I1128 11:46:03.195710   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.60 HWAddress:76:3a:56:4a:cd:30 ID:1,76:3a:56:4a:cd:30 Lease:0x63865ec4}
	I1128 11:46:03.195716   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.59 HWAddress:72:47:b5:ba:26:2d ID:1,72:47:b5:ba:26:2d Lease:0x63850d1d}
	I1128 11:46:03.195725   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.58 HWAddress:c6:fa:2c:41:9:47 ID:1,c6:fa:2c:41:9:47 Lease:0x63865dc0}
	I1128 11:46:03.195736   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.57 HWAddress:9e:67:b5:77:d3:1d ID:1,9e:67:b5:77:d3:1d Lease:0x63865d8c}
	I1128 11:46:03.195745   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.56 HWAddress:86:cc:24:51:6b:6e ID:1,86:cc:24:51:6b:6e Lease:0x638509d0}
	I1128 11:46:03.195753   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.55 HWAddress:42:36:92:b8:85:fa ID:1,42:36:92:b8:85:fa Lease:0x63850c35}
	I1128 11:46:03.195762   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.54 HWAddress:f2:f1:f4:12:c2:7c ID:1,f2:f1:f4:12:c2:7c Lease:0x63850c33}
	I1128 11:46:03.195771   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.53 HWAddress:12:89:36:a9:8d:3d ID:1,12:89:36:a9:8d:3d Lease:0x6385057f}
	I1128 11:46:03.195788   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.52 HWAddress:b6:93:42:e7:68:e2 ID:1,b6:93:42:e7:68:e2 Lease:0x63850569}
	I1128 11:46:03.195795   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.51 HWAddress:52:78:bf:d2:77:fd ID:1,52:78:bf:d2:77:fd Lease:0x63850540}
	I1128 11:46:03.195803   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.50 HWAddress:8a:c3:5c:7e:b1:4d ID:1,8a:c3:5c:7e:b1:4d Lease:0x63850544}
	I1128 11:46:03.195811   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.49 HWAddress:b6:44:15:6e:57:97 ID:1,b6:44:15:6e:57:97 Lease:0x63865622}
	I1128 11:46:03.195822   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.48 HWAddress:f2:23:ae:44:4:29 ID:1,f2:23:ae:44:4:29 Lease:0x638655a2}
	I1128 11:46:03.195831   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.47 HWAddress:16:4b:33:ee:29:c1 ID:1,16:4b:33:ee:29:c1 Lease:0x6386549d}
	I1128 11:46:03.195840   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.46 HWAddress:62:45:cd:91:58:e8 ID:1,62:45:cd:91:58:e8 Lease:0x63865465}
	I1128 11:46:03.195848   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:f2:6:3c:74:fc:db ID:1,f2:6:3c:74:fc:db Lease:0x638502db}
	I1128 11:46:03.195856   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:2a:90:7d:b1:f7:11 ID:1,2a:90:7d:b1:f7:11 Lease:0x6386517b}
	I1128 11:46:03.195865   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:4e:57:a2:36:3:67 ID:1,4e:57:a2:36:3:67 Lease:0x63865046}
	I1128 11:46:03.195874   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:26:2f:63:2b:d4:a5 ID:1,26:2f:63:2b:d4:a5 Lease:0x63864fe8}
	I1128 11:46:03.195881   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:5a:14:d7:4:7:2 ID:1,5a:14:d7:4:7:2 Lease:0x63864e46}
	I1128 11:46:03.195889   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:fa:d7:14:9c:bb:c0 ID:1,fa:d7:14:9c:bb:c0 Lease:0x63864e20}
	I1128 11:46:03.195897   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:f2:29:bb:45:2e:e2 ID:1,f2:29:bb:45:2e:e2 Lease:0x63864d0d}
	I1128 11:46:03.195905   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:ca:69:15:23:d2:a5 ID:1,ca:69:15:23:d2:a5 Lease:0x63864cc3}
	I1128 11:46:03.195929   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:5e:51:f3:a2:6:5f ID:1,5e:51:f3:a2:6:5f Lease:0x6384fbed}
	I1128 11:46:03.195944   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:aa:58:3a:d5:65:33 ID:1,aa:58:3a:d5:65:33 Lease:0x63864c67}
	I1128 11:46:03.195953   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:b2:f2:ea:1f:74:e ID:1,b2:f2:ea:1f:74:e Lease:0x6384fade}
	I1128 11:46:03.195962   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:36:f:68:69:52:3d ID:1,36:f:68:69:52:3d Lease:0x63864bca}
	I1128 11:46:03.195970   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:e:f3:df:c3:c2:b9 ID:1,e:f3:df:c3:c2:b9 Lease:0x6384fb39}
	I1128 11:46:03.195978   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:e2:67:c6:ab:e7:ea ID:1,e2:67:c6:ab:e7:ea Lease:0x6384fa41}
	I1128 11:46:03.195986   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:1e:a5:be:ca:98:56 ID:1,1e:a5:be:ca:98:56 Lease:0x63864b0c}
	I1128 11:46:03.195994   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:5a:ae:ab:90:bf:2b ID:1,5a:ae:ab:90:bf:2b Lease:0x6384f9c9}
	I1128 11:46:03.196001   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:1e:25:1a:be:8d:fd ID:1,1e:25:1a:be:8d:fd Lease:0x6384f977}
	I1128 11:46:03.196011   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:e6:9a:10:9a:9d:57 ID:1,e6:9a:10:9a:9d:57 Lease:0x6384f94d}
	I1128 11:46:03.196020   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:62:93:5e:a6:9d:67 ID:1,62:93:5e:a6:9d:67 Lease:0x63864a75}
	I1128 11:46:03.196028   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:b6:e3:fd:79:5c:f2 ID:1,b6:e3:fd:79:5c:f2 Lease:0x63864a69}
	I1128 11:46:03.196038   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ae:1d:54:5d:a8:a9 ID:1,ae:1d:54:5d:a8:a9 Lease:0x6384f8eb}
	I1128 11:46:03.196046   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ee:75:ae:dd:5:e ID:1,ee:75:ae:dd:5:e Lease:0x6386493d}
	I1128 11:46:03.196054   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:f6:ed:f4:c4:fa:ab ID:1,f6:ed:f4:c4:fa:ab Lease:0x63864903}
	I1128 11:46:03.196062   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:26:24:eb:25:b4:5e ID:1,26:24:eb:25:b4:5e Lease:0x638648e0}
	I1128 11:46:03.196073   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:96:d3:21:59:8c:6e ID:1,96:d3:21:59:8c:6e Lease:0x638648d2}
	I1128 11:46:03.196082   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:50:28:3:34:52 ID:1,d6:50:28:3:34:52 Lease:0x6384f751}
	I1128 11:46:03.196090   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:b2:8c:dc:88:c8:6f ID:1,b2:8c:dc:88:c8:6f Lease:0x6386487f}
	I1128 11:46:03.196098   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:4a:6:df:57:8c:60 ID:1,4a:6:df:57:8c:60 Lease:0x6386486f}
	I1128 11:46:03.196106   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:ca:dd:8e:48:25:9b ID:1,ca:dd:8e:48:25:9b Lease:0x6384f6e4}
	I1128 11:46:03.196114   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:3e:ab:e9:af:c5:1d ID:1,3e:ab:e9:af:c5:1d Lease:0x6384f67b}
	I1128 11:46:03.196124   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:4e:29:d6:1a:69:c6 ID:1,4e:29:d6:1a:69:c6 Lease:0x63864724}
	I1128 11:46:03.196133   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:42:bd:4c:e7:cd:a6 ID:1,42:bd:4c:e7:cd:a6 Lease:0x6385001b}
	I1128 11:46:03.196141   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:4e:39:2d:ba:b9:1f ID:1,4e:39:2d:ba:b9:1f Lease:0x6384f355}
	I1128 11:46:03.196149   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:aa:18:b6:fc:5:66 ID:1,aa:18:b6:fc:5:66 Lease:0x6384f59a}
	I1128 11:46:03.196156   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:a2:0:cf:d4:a4:1e ID:1,a2:0:cf:d4:a4:1e Lease:0x6384f598}
	I1128 11:46:03.196164   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:a6:b4:46:e:86:bb ID:1,a6:b4:46:e:86:bb Lease:0x6384ee86}
	I1128 11:46:03.196171   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ca:e9:48:5e:17:79 ID:1,ca:e9:48:5e:17:79 Lease:0x6384ee70}
	I1128 11:46:03.196180   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:aa:9e:45:37:85:ba ID:1,aa:9e:45:37:85:ba Lease:0x6384ee4a}
	I1128 11:46:03.196187   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:a:8d:65:60:cd:d8 ID:1,a:8d:65:60:cd:d8 Lease:0x63863f7b}
	I1128 11:46:03.196195   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:c6:57:55:bb:25:56 ID:1,c6:57:55:bb:25:56 Lease:0x63863f36}
	I1128 11:46:03.196208   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:6:81:84:1b:41:e8 ID:1,6:81:84:1b:41:e8 Lease:0x63863ec5}
	I1128 11:46:03.196217   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:46:e9:d4:c0:3a:d3 ID:1,46:e9:d4:c0:3a:d3 Lease:0x63863dcb}
	I1128 11:46:03.196225   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:5e:da:9c:e4:19:67 ID:1,5e:da:9c:e4:19:67 Lease:0x6384ec41}
	I1128 11:46:03.196232   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:fe:7d:42:ad:6c:69 ID:1,fe:7d:42:ad:6c:69 Lease:0x6384ec0f}
	I1128 11:46:05.197528   23695 main.go:134] libmachine: (auto-113537) DBG | Attempt 4
	I1128 11:46:05.197556   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:05.197673   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:05.198552   23695 main.go:134] libmachine: (auto-113537) DBG | Searching for b6:f5:64:15:38:f8 in /var/db/dhcpd_leases ...
	I1128 11:46:05.198728   23695 main.go:134] libmachine: (auto-113537) DBG | Found 72 entries in /var/db/dhcpd_leases!
	I1128 11:46:05.198743   23695 main.go:134] libmachine: (auto-113537) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.73 HWAddress:b6:f5:64:15:38:f8 ID:1,b6:f5:64:15:38:f8 Lease:0x6386617c}
	I1128 11:46:05.198757   23695 main.go:134] libmachine: (auto-113537) DBG | Found match: b6:f5:64:15:38:f8
	I1128 11:46:05.198764   23695 main.go:134] libmachine: (auto-113537) DBG | IP: 192.168.64.73
	I1128 11:46:05.198803   23695 main.go:134] libmachine: (auto-113537) Calling .GetConfigRaw
	I1128 11:46:05.199496   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:05.199636   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:05.199763   23695 main.go:134] libmachine: Waiting for machine to be running, this may take a few minutes...
	I1128 11:46:05.199776   23695 main.go:134] libmachine: (auto-113537) Calling .GetState
	I1128 11:46:05.199902   23695 main.go:134] libmachine: (auto-113537) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:05.199965   23695 main.go:134] libmachine: (auto-113537) DBG | hyperkit pid from json: 23706
	I1128 11:46:05.200645   23695 main.go:134] libmachine: Detecting operating system of created instance...
	I1128 11:46:05.200656   23695 main.go:134] libmachine: Waiting for SSH to be available...
	I1128 11:46:05.200663   23695 main.go:134] libmachine: Getting to WaitForSSH function...
	I1128 11:46:05.200672   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:05.200797   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:05.200928   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:05.201043   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:05.201151   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:05.201306   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:05.201509   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:05.201521   23695 main.go:134] libmachine: About to run SSH command:
	exit 0
	I1128 11:46:05.225652   23695 main.go:134] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I1128 11:46:04.547145   23619 pod_ready.go:102] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"False"
	I1128 11:46:06.048083   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.048097   23619 pod_ready.go:81] duration metric: took 8.009560784s waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.048103   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051216   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.051224   23619 pod_ready.go:81] duration metric: took 3.116595ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.051230   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053957   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.053964   23619 pod_ready.go:81] duration metric: took 2.729921ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.053969   23619 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.562996   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.563009   23619 pod_ready.go:81] duration metric: took 509.018441ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.563015   23619 pod_ready.go:38] duration metric: took 13.542714077s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.563028   23619 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1128 11:46:06.570229   23619 ops.go:34] apiserver oom_adj: -16
	I1128 11:46:06.570238   23619 kubeadm.go:631] restartCluster took 25.135447778s
	I1128 11:46:06.570243   23619 kubeadm.go:398] StartCluster complete in 25.15757485s
	I1128 11:46:06.570252   23619 settings.go:142] acquiring lock: {Name:mk7392d5e25d999df834aabe5db592398d1f845f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.570337   23619 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 11:46:06.570709   23619 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/15411-14646/kubeconfig: {Name:mk58f8fa3810393ea66460d5cf44fc66020c4987 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1128 11:46:06.571302   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]
string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.573288   23619 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-114428" rescaled to 1
	I1128 11:46:06.573316   23619 start.go:212] Will wait 6m0s for node &{Name: IP:192.168.64.71 Port:8443 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I1128 11:46:06.573325   23619 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1128 11:46:06.595199   23619 out.go:177] * Verifying Kubernetes components...
	I1128 11:46:06.573354   23619 addons.go:486] enableAddons start: toEnable=map[], additional=[]
	I1128 11:46:06.573498   23619 config.go:180] Loaded profile config "pause-114428": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:46:06.629592   23619 start.go:806] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I1128 11:46:06.637368   23619 addons.go:65] Setting default-storageclass=true in profile "pause-114428"
	I1128 11:46:06.637368   23619 addons.go:65] Setting storage-provisioner=true in profile "pause-114428"
	I1128 11:46:06.637416   23619 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-114428"
	I1128 11:46:06.637436   23619 addons.go:227] Setting addon storage-provisioner=true in "pause-114428"
	I1128 11:46:06.637382   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	W1128 11:46:06.637448   23619 addons.go:236] addon storage-provisioner should already be in state true
	I1128 11:46:06.637510   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.637835   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637860   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.637905   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.637925   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.645886   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58739
	I1128 11:46:06.646195   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58741
	I1128 11:46:06.646301   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646561   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.646698   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646713   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646891   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.646902   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.646912   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647097   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.647214   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.647287   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.647304   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.647310   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.647389   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.648684   23619 node_ready.go:35] waiting up to 6m0s for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.649127   23619 kapi.go:59] client config for pause-114428: &rest.Config{Host:"https://192.168.64.71:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.crt", KeyFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/pause-114428/client.key", CAFile:"/Users/jenkins/minikube-integration/15411-14646/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]
string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2356c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1128 11:46:06.650857   23619 node_ready.go:49] node "pause-114428" has status "Ready":"True"
	I1128 11:46:06.650866   23619 node_ready.go:38] duration metric: took 2.162932ms waiting for node "pause-114428" to be "Ready" ...
	I1128 11:46:06.650871   23619 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:06.651663   23619 addons.go:227] Setting addon default-storageclass=true in "pause-114428"
	W1128 11:46:06.651672   23619 addons.go:236] addon default-storageclass should already be in state true
	I1128 11:46:06.651690   23619 host.go:66] Checking if "pause-114428" exists ...
	I1128 11:46:06.651949   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.651975   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.654611   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58743
	I1128 11:46:06.654971   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.655291   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.655304   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.655510   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.655602   23619 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.655605   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.655686   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.655766   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.656669   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.659108   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58745
	I1128 11:46:06.678307   23619 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1128 11:46:06.679132   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.699526   23619 addons.go:419] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.699546   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1128 11:46:06.699568   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.699800   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.700032   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.700176   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.700207   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.700229   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.700424   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.700648   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.701359   23619 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:46:06.701397   23619 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:46:06.709163   23619 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58748
	I1128 11:46:06.709518   23619 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:46:06.709864   23619 main.go:134] libmachine: Using API Version  1
	I1128 11:46:06.709878   23619 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:46:06.710074   23619 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:46:06.710166   23619 main.go:134] libmachine: (pause-114428) Calling .GetState
	I1128 11:46:06.710244   23619 main.go:134] libmachine: (pause-114428) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:46:06.710318   23619 main.go:134] libmachine: (pause-114428) DBG | hyperkit pid from json: 23529
	I1128 11:46:06.711180   23619 main.go:134] libmachine: (pause-114428) Calling .DriverName
	I1128 11:46:06.711324   23619 addons.go:419] installing /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.711332   23619 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1128 11:46:06.711341   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHHostname
	I1128 11:46:06.711421   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHPort
	I1128 11:46:06.711501   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHKeyPath
	I1128 11:46:06.711570   23619 main.go:134] libmachine: (pause-114428) Calling .GetSSHUsername
	I1128 11:46:06.711656   23619 sshutil.go:53] new ssh client: &{IP:192.168.64.71 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/pause-114428/id_rsa Username:docker}
	I1128 11:46:06.752029   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1128 11:46:06.761901   23619 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1128 11:46:06.844170   23619 pod_ready.go:92] pod "coredns-565d847f94-hstbw" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:06.844181   23619 pod_ready.go:81] duration metric: took 188.565794ms waiting for pod "coredns-565d847f94-hstbw" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:06.844189   23619 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245650   23619 pod_ready.go:92] pod "etcd-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.245661   23619 pod_ready.go:81] duration metric: took 401.459797ms waiting for pod "etcd-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.245670   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.358355   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358369   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358466   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358476   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358539   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358554   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358561   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358569   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358578   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358605   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358613   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358622   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358628   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358710   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358722   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358722   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358738   23619 main.go:134] libmachine: Making call to close driver server
	I1128 11:46:07.358754   23619 main.go:134] libmachine: (pause-114428) Calling .Close
	I1128 11:46:07.358877   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.358891   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.358932   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.358981   23619 main.go:134] libmachine: Successfully made call to close driver server
	I1128 11:46:07.359000   23619 main.go:134] libmachine: Making call to close connection to plugin binary
	I1128 11:46:07.359083   23619 main.go:134] libmachine: (pause-114428) DBG | Closing plugin on server side
	I1128 11:46:07.398686   23619 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I1128 11:46:07.457752   23619 addons.go:488] enableAddons completed in 884.379177ms
	I1128 11:46:07.646777   23619 pod_ready.go:92] pod "kube-apiserver-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:07.646790   23619 pod_ready.go:81] duration metric: took 401.1063ms waiting for pod "kube-apiserver-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:07.646798   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043898   23619 pod_ready.go:92] pod "kube-controller-manager-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.043909   23619 pod_ready.go:81] duration metric: took 397.096432ms waiting for pod "kube-controller-manager-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.043915   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445110   23619 pod_ready.go:92] pod "kube-proxy-gm2kh" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.445121   23619 pod_ready.go:81] duration metric: took 401.192773ms waiting for pod "kube-proxy-gm2kh" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.445128   23619 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844356   23619 pod_ready.go:92] pod "kube-scheduler-pause-114428" in "kube-system" namespace has status "Ready":"True"
	I1128 11:46:08.844366   23619 pod_ready.go:81] duration metric: took 399.225652ms waiting for pod "kube-scheduler-pause-114428" in "kube-system" namespace to be "Ready" ...
	I1128 11:46:08.844372   23619 pod_ready.go:38] duration metric: took 2.193441179s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I1128 11:46:08.844385   23619 api_server.go:51] waiting for apiserver process to appear ...
	I1128 11:46:08.844445   23619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:46:08.853453   23619 api_server.go:71] duration metric: took 2.280070722s to wait for apiserver process to appear ...
	I1128 11:46:08.853468   23619 api_server.go:87] waiting for apiserver healthz status ...
	I1128 11:46:08.853477   23619 api_server.go:252] Checking apiserver healthz at https://192.168.64.71:8443/healthz ...
	I1128 11:46:08.857396   23619 api_server.go:278] https://192.168.64.71:8443/healthz returned 200:
	ok
	I1128 11:46:08.857918   23619 api_server.go:140] control plane version: v1.25.3
	I1128 11:46:08.857929   23619 api_server.go:130] duration metric: took 4.455033ms to wait for apiserver health ...
	I1128 11:46:08.857934   23619 system_pods.go:43] waiting for kube-system pods to appear ...
	I1128 11:46:09.045441   23619 system_pods.go:59] 7 kube-system pods found
	I1128 11:46:09.045455   23619 system_pods.go:61] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.045459   23619 system_pods.go:61] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.045463   23619 system_pods.go:61] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.045466   23619 system_pods.go:61] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.045470   23619 system_pods.go:61] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.045473   23619 system_pods.go:61] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.045478   23619 system_pods.go:61] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1128 11:46:09.045482   23619 system_pods.go:74] duration metric: took 187.540087ms to wait for pod list to return data ...
	I1128 11:46:09.045488   23619 default_sa.go:34] waiting for default service account to be created ...
	I1128 11:46:09.243309   23619 default_sa.go:45] found service account: "default"
	I1128 11:46:09.243320   23619 default_sa.go:55] duration metric: took 197.823748ms for default service account to be created ...
	I1128 11:46:09.243326   23619 system_pods.go:116] waiting for k8s-apps to be running ...
	I1128 11:46:09.445692   23619 system_pods.go:86] 7 kube-system pods found
	I1128 11:46:09.445706   23619 system_pods.go:89] "coredns-565d847f94-hstbw" [21d18ebd-7324-4815-857d-aa2cea270e10] Running
	I1128 11:46:09.445711   23619 system_pods.go:89] "etcd-pause-114428" [e76c9996-d2c9-4c3e-bdd1-3b9a162b20d5] Running
	I1128 11:46:09.445714   23619 system_pods.go:89] "kube-apiserver-pause-114428" [ef055aee-e95a-4698-b207-8039ce03eefd] Running
	I1128 11:46:09.445718   23619 system_pods.go:89] "kube-controller-manager-pause-114428" [fb607845-660c-4078-937b-aa7f3841460e] Running
	I1128 11:46:09.445721   23619 system_pods.go:89] "kube-proxy-gm2kh" [5f48a047-d53f-4630-beec-4846d0327f1f] Running
	I1128 11:46:09.445725   23619 system_pods.go:89] "kube-scheduler-pause-114428" [d0a7e59a-952f-43b7-aef1-7f00b12e962a] Running
	I1128 11:46:09.445729   23619 system_pods.go:89] "storage-provisioner" [f3bb3ce4-fe67-4d5b-9a95-048c18b13469] Running
	I1128 11:46:09.445733   23619 system_pods.go:126] duration metric: took 202.39991ms to wait for k8s-apps to be running ...
	I1128 11:46:09.445739   23619 system_svc.go:44] waiting for kubelet service to be running ....
	I1128 11:46:09.445802   23619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1128 11:46:09.455773   23619 system_svc.go:56] duration metric: took 10.02855ms WaitForService to wait for kubelet.
	I1128 11:46:09.455787   23619 kubeadm.go:573] duration metric: took 2.882394245s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I1128 11:46:09.455798   23619 node_conditions.go:102] verifying NodePressure condition ...
	I1128 11:46:09.649312   23619 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I1128 11:46:09.649327   23619 node_conditions.go:123] node cpu capacity is 2
	I1128 11:46:09.649353   23619 node_conditions.go:105] duration metric: took 193.543357ms to run NodePressure ...
	I1128 11:46:09.649377   23619 start.go:217] waiting for startup goroutines ...
	I1128 11:46:09.649843   23619 ssh_runner.go:195] Run: rm -f paused
	I1128 11:46:09.693369   23619 start.go:535] kubectl: 1.25.2, cluster: 1.25.3 (minor skew: 0)
	I1128 11:46:09.719383   23619 out.go:177] * Done! kubectl is now configured to use "pause-114428" cluster and "default" namespace by default
	I1128 11:46:08.296763   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I1128 11:46:08.296776   23695 main.go:134] libmachine: Detecting the provisioner...
	I1128 11:46:08.296782   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.296950   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.297055   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.297145   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.297237   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.297375   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.297501   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.297509   23695 main.go:134] libmachine: About to run SSH command:
	cat /etc/os-release
	I1128 11:46:08.365605   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g5c46c87-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I1128 11:46:08.365663   23695 main.go:134] libmachine: found compatible host: buildroot
	I1128 11:46:08.365670   23695 main.go:134] libmachine: Provisioning with buildroot...
	I1128 11:46:08.365676   23695 main.go:134] libmachine: (auto-113537) Calling .GetMachineName
	I1128 11:46:08.365810   23695 buildroot.go:166] provisioning hostname "auto-113537"
	I1128 11:46:08.365822   23695 main.go:134] libmachine: (auto-113537) Calling .GetMachineName
	I1128 11:46:08.365912   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.366034   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.366123   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.366206   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.366293   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.366429   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.366552   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.366562   23695 main.go:134] libmachine: About to run SSH command:
	sudo hostname auto-113537 && echo "auto-113537" | sudo tee /etc/hostname
	I1128 11:46:08.445275   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: auto-113537
	
	I1128 11:46:08.445292   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.445430   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.445527   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.445615   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.445722   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.445841   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.445957   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.445968   23695 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sauto-113537' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 auto-113537/g' /etc/hosts;
				else 
					echo '127.0.1.1 auto-113537' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1128 11:46:08.519903   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I1128 11:46:08.519925   23695 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/15411-14646/.minikube CaCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/15411-14646/.minikube}
	I1128 11:46:08.519942   23695 buildroot.go:174] setting up certificates
	I1128 11:46:08.519970   23695 provision.go:83] configureAuth start
	I1128 11:46:08.519981   23695 main.go:134] libmachine: (auto-113537) Calling .GetMachineName
	I1128 11:46:08.520115   23695 main.go:134] libmachine: (auto-113537) Calling .GetIP
	I1128 11:46:08.520190   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.520287   23695 provision.go:138] copyHostCerts
	I1128 11:46:08.520381   23695 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem, removing ...
	I1128 11:46:08.520391   23695 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem
	I1128 11:46:08.520519   23695 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/ca.pem (1078 bytes)
	I1128 11:46:08.520726   23695 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem, removing ...
	I1128 11:46:08.520733   23695 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem
	I1128 11:46:08.520809   23695 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/cert.pem (1123 bytes)
	I1128 11:46:08.520980   23695 exec_runner.go:144] found /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem, removing ...
	I1128 11:46:08.520986   23695 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem
	I1128 11:46:08.521060   23695 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/15411-14646/.minikube/key.pem (1679 bytes)
	I1128 11:46:08.521191   23695 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca-key.pem org=jenkins.auto-113537 san=[192.168.64.73 192.168.64.73 localhost 127.0.0.1 minikube auto-113537]
	I1128 11:46:08.643948   23695 provision.go:172] copyRemoteCerts
	I1128 11:46:08.644018   23695 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1128 11:46:08.644037   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.644180   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.644286   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.644370   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.644445   23695 sshutil.go:53] new ssh client: &{IP:192.168.64.73 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa Username:docker}
	I1128 11:46:08.684080   23695 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1128 11:46:08.699306   23695 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server.pem --> /etc/docker/server.pem (1212 bytes)
	I1128 11:46:08.714277   23695 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1128 11:46:08.729324   23695 provision.go:86] duration metric: configureAuth took 209.335033ms
	I1128 11:46:08.729335   23695 buildroot.go:189] setting minikube options for container-runtime
	I1128 11:46:08.729470   23695 config.go:180] Loaded profile config "auto-113537": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:46:08.729510   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:08.729641   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.729729   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.729809   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.729885   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.729971   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.730093   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.730205   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.730214   23695 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I1128 11:46:08.801346   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I1128 11:46:08.801361   23695 buildroot.go:70] root file system type: tmpfs
	I1128 11:46:08.801476   23695 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I1128 11:46:08.801487   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.801623   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.801734   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.801822   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.801920   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.802106   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.802223   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.802266   23695 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I1128 11:46:08.878625   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I1128 11:46:08.878649   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:08.878785   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:08.878874   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.878965   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:08.879043   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:08.879183   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:08.879304   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:08.879317   23695 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I1128 11:46:09.356411   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
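The `diff … || { mv …; systemctl … }` one-liner above is an idempotent install pattern: the candidate unit is written to `docker.service.new`, and only moved into place (with a reload/enable/restart) when it differs from the installed copy, or when the installed copy is missing entirely, as the "can't stat" result here records. A minimal sketch of the pattern, using temp files in place of the real root-owned paths and eliding the systemctl steps:

```shell
set -eu
# Stand-ins for /lib/systemd/system/docker.service and its ".new" candidate;
# the real paths need root, so this sketch uses temp files.
unit=$(mktemp)
candidate="${unit}.new"
printf 'old unit\n' > "$unit"
printf 'new unit\n' > "$candidate"
# `diff -u` exits non-zero when the files differ (or the installed unit is
# absent), which is what triggers the install branch after `||` in the log.
if ! diff -u "$unit" "$candidate" >/dev/null 2>&1; then
  mv "$candidate" "$unit"
fi
result=$(cat "$unit")
echo "$result"
```

Running the comparison first means an unchanged unit file never triggers a Docker restart on subsequent starts.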
	I1128 11:46:09.356427   23695 main.go:134] libmachine: Checking connection to Docker...
	I1128 11:46:09.356433   23695 main.go:134] libmachine: (auto-113537) Calling .GetURL
	I1128 11:46:09.356567   23695 main.go:134] libmachine: Docker is up and running!
	I1128 11:46:09.356576   23695 main.go:134] libmachine: Reticulating splines...
	I1128 11:46:09.356580   23695 client.go:171] LocalClient.Create took 12.596235122s
	I1128 11:46:09.356590   23695 start.go:167] duration metric: libmachine.API.Create for "auto-113537" took 12.596275301s
	I1128 11:46:09.356600   23695 start.go:300] post-start starting for "auto-113537" (driver="hyperkit")
	I1128 11:46:09.356605   23695 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1128 11:46:09.356615   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:09.356761   23695 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1128 11:46:09.356778   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:09.356861   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:09.356955   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:09.357041   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:09.357123   23695 sshutil.go:53] new ssh client: &{IP:192.168.64.73 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa Username:docker}
	I1128 11:46:09.403737   23695 ssh_runner.go:195] Run: cat /etc/os-release
	I1128 11:46:09.406805   23695 info.go:137] Remote host: Buildroot 2021.02.12
	I1128 11:46:09.406817   23695 filesync.go:126] Scanning /Users/jenkins/minikube-integration/15411-14646/.minikube/addons for local assets ...
	I1128 11:46:09.406920   23695 filesync.go:126] Scanning /Users/jenkins/minikube-integration/15411-14646/.minikube/files for local assets ...
	I1128 11:46:09.407114   23695 filesync.go:149] local asset: /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem -> 158232.pem in /etc/ssl/certs
	I1128 11:46:09.407325   23695 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1128 11:46:09.415492   23695 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/ssl/certs/158232.pem --> /etc/ssl/certs/158232.pem (1708 bytes)
	I1128 11:46:09.437472   23695 start.go:303] post-start completed in 80.861837ms
	I1128 11:46:09.437503   23695 main.go:134] libmachine: (auto-113537) Calling .GetConfigRaw
	I1128 11:46:09.438136   23695 main.go:134] libmachine: (auto-113537) Calling .GetIP
	I1128 11:46:09.438296   23695 profile.go:148] Saving config to /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/config.json ...
	I1128 11:46:09.438605   23695 start.go:128] duration metric: createHost completed in 12.725457565s
	I1128 11:46:09.438622   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:09.438729   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:09.438817   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:09.438900   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:09.438994   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:09.439109   23695 main.go:134] libmachine: Using SSH client type: native
	I1128 11:46:09.439206   23695 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e7180] 0x13ea300 <nil>  [] 0s} 192.168.64.73 22 <nil> <nil>}
	I1128 11:46:09.439214   23695 main.go:134] libmachine: About to run SSH command:
	date +%s.%N
	I1128 11:46:09.512569   23695 main.go:134] libmachine: SSH cmd err, output: <nil>: 1669664769.574520710
	
	I1128 11:46:09.512580   23695 fix.go:207] guest clock: 1669664769.574520710
	I1128 11:46:09.512585   23695 fix.go:220] Guest: 2022-11-28 11:46:09.57452071 -0800 PST Remote: 2022-11-28 11:46:09.438615 -0800 PST m=+13.150165847 (delta=135.90571ms)
	I1128 11:46:09.512609   23695 fix.go:191] guest clock delta is within tolerance: 135.90571ms
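The clock check above runs `date +%s.%N` in the guest, subtracts the host's own timestamp, and accepts the result if the skew is within tolerance. A hedged recreation using the two timestamps from the log (the 2-second threshold below is an assumption for illustration, not minikube's actual tolerance value):

```shell
# Timestamps taken from the log lines above: guest clock vs. host clock.
guest=1669664769.574520710
host=1669664769.438615
# Absolute difference, rounded to milliseconds.
delta=$(awk -v g="$guest" -v h="$host" 'BEGIN { d = g - h; if (d < 0) d = -d; printf "%.3f", d }')
# Tolerance of 2.0s is an illustrative assumption.
within=$(awk -v d="$delta" 'BEGIN { print (d < 2.0) ? "yes" : "no" }')
echo "guest clock delta ${delta}s, within tolerance: $within"
```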
	I1128 11:46:09.512613   23695 start.go:83] releasing machines lock for "auto-113537", held for 12.799615539s
	I1128 11:46:09.512631   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:09.512762   23695 main.go:134] libmachine: (auto-113537) Calling .GetIP
	I1128 11:46:09.512844   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:09.513131   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:09.513234   23695 main.go:134] libmachine: (auto-113537) Calling .DriverName
	I1128 11:46:09.513304   23695 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1128 11:46:09.513336   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:09.513349   23695 ssh_runner.go:195] Run: cat /version.json
	I1128 11:46:09.513360   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHHostname
	I1128 11:46:09.513447   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:09.513465   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHPort
	I1128 11:46:09.513588   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:09.513612   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHKeyPath
	I1128 11:46:09.513680   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:09.513702   23695 main.go:134] libmachine: (auto-113537) Calling .GetSSHUsername
	I1128 11:46:09.513753   23695 sshutil.go:53] new ssh client: &{IP:192.168.64.73 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa Username:docker}
	I1128 11:46:09.513775   23695 sshutil.go:53] new ssh client: &{IP:192.168.64.73 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/auto-113537/id_rsa Username:docker}
	I1128 11:46:09.588384   23695 ssh_runner.go:195] Run: systemctl --version
	I1128 11:46:09.592902   23695 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 11:46:09.593007   23695 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I1128 11:46:09.607780   23695 docker.go:613] Got preloaded images: 
	I1128 11:46:09.607792   23695 docker.go:619] registry.k8s.io/kube-apiserver:v1.25.3 wasn't preloaded
	I1128 11:46:09.607850   23695 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I1128 11:46:09.614093   23695 ssh_runner.go:195] Run: which lz4
	I1128 11:46:09.616552   23695 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I1128 11:46:09.619096   23695 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I1128 11:46:09.619120   23695 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (404166592 bytes)
	I1128 11:46:10.827984   23695 docker.go:577] Took 1.211465 seconds to copy over tarball
	I1128 11:46:10.828052   23695 ssh_runner.go:195] Run: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4
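The copy above is gated by the existence probe a few lines earlier: GNU `stat -c "%s %y"` (size and mtime) exits with status 1 when the target is absent, and that failure is what triggers the scp of the preload tarball. A small sketch of the check-then-copy decision, with a temp path standing in for `/preloaded.tar.lz4`:

```shell
# Temp directory guarantees the probed file does not exist yet.
target="$(mktemp -d)/preloaded.tar.lz4"
# GNU stat: prints "<size> <mtime>" and exits 0 if present, exits 1 if absent.
if stat -c "%s %y" "$target" >/dev/null 2>&1; then
  action="reuse existing tarball"
else
  action="copy tarball over"
fi
echo "$action"
```

Probing with `stat` rather than re-copying unconditionally is what lets a warm VM skip the ~400 MB transfer on later starts.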
	
	* 
	* ==> Docker <==
	* -- Journal begins at Mon 2022-11-28 19:44:41 UTC, ends at Mon 2022-11-28 19:46:14 UTC. --
	Nov 28 19:45:47 pause-114428 dockerd[3927]: time="2022-11-28T19:45:47.939073752Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9ead6b6b65f5804f29c1c11bac735e78dba4e73dd0ea7bc34808289436343d3e pid=5532 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098426358Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098563598Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.098574562Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.099290725Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/a1f9e3b2cd7454357bddb17295f4a1e35da4a58f1edacbdeb921181688bff5bf pid=5709 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190361701Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190396367Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190409497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.190614418Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/20905b729742236666d48a2d1afd75a7a28ef485f06f7e052006225c0ca8226e pid=5751 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470362281Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470523701Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470550421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.470852155Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/849f8cddc58d7686ef772e9e1c8845a70fba101ad08600f47df0f8d455845008 pid=5861 runtime=io.containerd.runc.v2
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923869140Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923931529Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.923940891Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:45:52 pause-114428 dockerd[3927]: time="2022-11-28T19:45:52.924141542Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e3ce284bd8adaaa9151908279892cbec5e5460b138a1fa7e6ec90e445f479e8c pid=5928 runtime=io.containerd.runc.v2
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889587265Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889645913Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.889655186Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:46:07 pause-114428 dockerd[3927]: time="2022-11-28T19:46:07.890181143Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/abce6d5dca595edd52e4a8355a41885daa2203a0494b505eef197f4b59c6878b pid=6156 runtime=io.containerd.runc.v2
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169171124Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169229784Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169239029Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Nov 28 19:46:08 pause-114428 dockerd[3927]: time="2022-11-28T19:46:08.169542788Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/ea817622d730befb6352b0a617af62b189382f856291405e1e4cae75244af470 pid=6204 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID
	ea817622d730b       6e38f40d628db       6 seconds ago       Running             storage-provisioner       0                   abce6d5dca595
	e3ce284bd8ada       5185b96f0becf       22 seconds ago      Running             coredns                   2                   849f8cddc58d7
	20905b7297422       beaaf00edd38a       22 seconds ago      Running             kube-proxy                2                   a1f9e3b2cd745
	9ead6b6b65f58       0346dbd74bcb9       27 seconds ago      Running             kube-apiserver            3                   73d62198b6148
	2f771b319f03a       6039992312758       27 seconds ago      Running             kube-controller-manager   2                   cb2097f545b08
	f24fa2c61249f       6d23ec0e8b87e       27 seconds ago      Running             kube-scheduler            2                   7f1a6817926f8
	8ca995856c18a       a8a176a5d5d69       27 seconds ago      Running             etcd                      2                   be8a95ce71d21
	a9d36028607be       5185b96f0becf       29 seconds ago      Created             coredns                   1                   8e79d957b3d6d
	cb7ec3000bb98       0346dbd74bcb9       30 seconds ago      Created             kube-apiserver            2                   eca5b163aceeb
	7cf378f4872e2       6d23ec0e8b87e       38 seconds ago      Exited              kube-scheduler            1                   c2404a1daca8e
	cf3c8b0b6388f       a8a176a5d5d69       39 seconds ago      Exited              etcd                      1                   aaa7b0e997b98
	677c0d9be193a       6039992312758       40 seconds ago      Exited              kube-controller-manager   1                   71edc30d3dbb5
	d3c7e3e4259d7       beaaf00edd38a       40 seconds ago      Exited              kube-proxy                1                   81e0e82e398af
	
	* 
	* ==> coredns [a9d36028607b] <==
	* 
	* 
	* ==> coredns [e3ce284bd8ad] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> describe nodes <==
	* Name:               pause-114428
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-114428
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ca606bc88c49b8633ca8bf16fc174bca0c3a74e
	                    minikube.k8s.io/name=pause-114428
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_11_28T11_45_12_0700
	                    minikube.k8s.io/version=v1.28.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 28 Nov 2022 19:45:11 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-114428
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 28 Nov 2022 19:46:11 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:11 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 28 Nov 2022 19:45:51 +0000   Mon, 28 Nov 2022 19:45:12 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.71
	  Hostname:    pause-114428
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 ec7b50837b2241e5af48c3dad42d792d
	  System UUID:                151111ed-0000-0000-94c8-f01898ef957c
	  Boot ID:                    a8c97c69-e70e-4c1a-82a4-f868cdd4b19a
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.21
	  Kubelet Version:            v1.25.3
	  Kube-Proxy Version:         v1.25.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                    CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                    ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-hstbw                100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     51s
	  kube-system                 etcd-pause-114428                       100m (5%)     0 (0%)      100Mi (5%)       0 (0%)         63s
	  kube-system                 kube-apiserver-pause-114428             250m (12%)    0 (0%)      0 (0%)           0 (0%)         63s
	  kube-system                 kube-controller-manager-pause-114428    200m (10%)    0 (0%)      0 (0%)           0 (0%)         63s
	  kube-system                 kube-proxy-gm2kh                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         51s
	  kube-system                 kube-scheduler-pause-114428             100m (5%)     0 (0%)      0 (0%)           0 (0%)         63s
	  kube-system                 storage-provisioner                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         8s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 49s                kube-proxy       
	  Normal  Starting                 22s                kube-proxy       
	  Normal  NodeHasSufficientPID     63s                kubelet          Node pause-114428 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  63s                kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  63s                kubelet          Node pause-114428 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    63s                kubelet          Node pause-114428 status is now: NodeHasNoDiskPressure
	  Normal  NodeReady                63s                kubelet          Node pause-114428 status is now: NodeReady
	  Normal  Starting                 63s                kubelet          Starting kubelet.
	  Normal  RegisteredNode           51s                node-controller  Node pause-114428 event: Registered Node pause-114428 in Controller
	  Normal  Starting                 29s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  29s (x8 over 29s)  kubelet          Node pause-114428 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    29s (x8 over 29s)  kubelet          Node pause-114428 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     29s (x7 over 29s)  kubelet          Node pause-114428 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  29s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           12s                node-controller  Node pause-114428 event: Registered Node pause-114428 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000002] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.872776] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000004] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +3.502021] systemd-fstab-generator[550]: Ignoring "noauto" for root device
	[  +0.080598] systemd-fstab-generator[561]: Ignoring "noauto" for root device
	[  +4.621130] systemd-fstab-generator[784]: Ignoring "noauto" for root device
	[  +1.389487] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.211553] systemd-fstab-generator[946]: Ignoring "noauto" for root device
	[  +0.083763] systemd-fstab-generator[957]: Ignoring "noauto" for root device
	[  +0.095050] systemd-fstab-generator[968]: Ignoring "noauto" for root device
	[  +1.348467] systemd-fstab-generator[1118]: Ignoring "noauto" for root device
	[  +0.091839] systemd-fstab-generator[1129]: Ignoring "noauto" for root device
	[  +3.080266] systemd-fstab-generator[1343]: Ignoring "noauto" for root device
	[  +0.541704] kauditd_printk_skb: 68 callbacks suppressed
	[Nov28 19:45] systemd-fstab-generator[1999]: Ignoring "noauto" for root device
	[ +13.258785] kauditd_printk_skb: 8 callbacks suppressed
	[  +6.823620] kauditd_printk_skb: 20 callbacks suppressed
	[  +0.847816] systemd-fstab-generator[2971]: Ignoring "noauto" for root device
	[  +0.183187] systemd-fstab-generator[3040]: Ignoring "noauto" for root device
	[  +0.140719] systemd-fstab-generator[3051]: Ignoring "noauto" for root device
	[  +7.553990] systemd-fstab-generator[4373]: Ignoring "noauto" for root device
	[  +0.096660] systemd-fstab-generator[4384]: Ignoring "noauto" for root device
	[  +3.112545] kauditd_printk_skb: 31 callbacks suppressed
	[  +2.227867] systemd-fstab-generator[5154]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [8ca995856c18] <==
	* {"level":"info","ts":"2022-11-28T19:45:47.772Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"9d8eb8badbf65f53","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 switched to configuration voters=(11353214823341383507)"}
	{"level":"info","ts":"2022-11-28T19:45:47.773Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"148f46ea0e83aed3","local-member-id":"9d8eb8badbf65f53","added-peer-id":"9d8eb8badbf65f53","added-peer-peer-urls":["https://192.168.64.71:2380"]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"148f46ea0e83aed3","local-member-id":"9d8eb8badbf65f53","cluster-version":"3.5"}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"9d8eb8badbf65f53","initial-advertise-peer-urls":["https://192.168.64.71:2380"],"listen-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.71:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-11-28T19:45:47.774Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-11-28T19:45:47.775Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:47.775Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 is starting a new election at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became pre-candidate at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgPreVoteResp from 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became candidate at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgVoteResp from 9d8eb8badbf65f53 at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became leader at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.565Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: 9d8eb8badbf65f53 elected leader 9d8eb8badbf65f53 at term 4"}
	{"level":"info","ts":"2022-11-28T19:45:49.570Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"9d8eb8badbf65f53","local-member-attributes":"{Name:pause-114428 ClientURLs:[https://192.168.64.71:2379]}","request-path":"/0/members/9d8eb8badbf65f53/attributes","cluster-id":"148f46ea0e83aed3","publish-timeout":"7s"}
	{"level":"info","ts":"2022-11-28T19:45:49.570Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:49.571Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-11-28T19:45:49.572Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:49.572Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.71:2379"}
	{"level":"info","ts":"2022-11-28T19:45:49.583Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:49.583Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> etcd [cf3c8b0b6388] <==
	* {"level":"info","ts":"2022-11-28T19:45:35.804Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:35.804Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 is starting a new election at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became pre-candidate at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgPreVoteResp from 9d8eb8badbf65f53 at term 2"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became candidate at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 received MsgVoteResp from 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"9d8eb8badbf65f53 became leader at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.794Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: 9d8eb8badbf65f53 elected leader 9d8eb8badbf65f53 at term 3"}
	{"level":"info","ts":"2022-11-28T19:45:36.795Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"9d8eb8badbf65f53","local-member-attributes":"{Name:pause-114428 ClientURLs:[https://192.168.64.71:2379]}","request-path":"/0/members/9d8eb8badbf65f53/attributes","cluster-id":"148f46ea0e83aed3","publish-timeout":"7s"}
	{"level":"info","ts":"2022-11-28T19:45:36.795Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.71:2379"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-11-28T19:45:36.796Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-11-28T19:45:36.798Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	WARNING: 2022/11/28 19:45:40 [core] grpc: Server.processUnaryRPC failed to write status connection error: desc = "transport is closing"
	{"level":"info","ts":"2022-11-28T19:45:40.086Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-11-28T19:45:40.086Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-114428","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"]}
	WARNING: 2022/11/28 19:45:40 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/11/28 19:45:40 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.71:2379 192.168.64.71:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.71:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-11-28T19:45:40.140Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"9d8eb8badbf65f53","current-leader-member-id":"9d8eb8badbf65f53"}
	{"level":"info","ts":"2022-11-28T19:45:40.141Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:40.143Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.71:2380"}
	{"level":"info","ts":"2022-11-28T19:45:40.143Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-114428","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.71:2380"],"advertise-client-urls":["https://192.168.64.71:2379"]}
	
	* 
	* ==> kernel <==
	*  19:46:15 up 1 min,  0 users,  load average: 0.85, 0.33, 0.12
	Linux pause-114428 5.10.57 #1 SMP Thu Nov 17 20:18:45 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [9ead6b6b65f5] <==
	* I1128 19:45:51.391536       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1128 19:45:51.419748       1 controller.go:85] Starting OpenAPI controller
	I1128 19:45:51.420165       1 controller.go:85] Starting OpenAPI V3 controller
	I1128 19:45:51.420240       1 naming_controller.go:291] Starting NamingConditionController
	I1128 19:45:51.420375       1 establishing_controller.go:76] Starting EstablishingController
	I1128 19:45:51.420487       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I1128 19:45:51.420544       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I1128 19:45:51.420636       1 crd_finalizer.go:266] Starting CRDFinalizer
	I1128 19:45:51.495238       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I1128 19:45:51.497265       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I1128 19:45:51.501641       1 cache.go:39] Caches are synced for autoregister controller
	I1128 19:45:51.501852       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I1128 19:45:51.502356       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I1128 19:45:51.515859       1 shared_informer.go:262] Caches are synced for node_authorizer
	I1128 19:45:51.529117       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I1128 19:45:51.558519       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I1128 19:45:52.202910       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I1128 19:45:52.402628       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1128 19:45:53.049729       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I1128 19:45:53.057998       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I1128 19:45:53.080027       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I1128 19:45:53.095389       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1128 19:45:53.099831       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1128 19:46:03.782524       1 controller.go:616] quota admission added evaluator for: endpoints
	I1128 19:46:03.785336       1 controller.go:616] quota admission added evaluator for: endpointslices.discovery.k8s.io
	
	* 
	* ==> kube-apiserver [cb7ec3000bb9] <==
	* 
	* 
	* ==> kube-controller-manager [2f771b319f03] <==
	* I1128 19:46:03.803704       1 shared_informer.go:262] Caches are synced for namespace
	I1128 19:46:03.803906       1 shared_informer.go:262] Caches are synced for attach detach
	I1128 19:46:03.805390       1 shared_informer.go:262] Caches are synced for PVC protection
	I1128 19:46:03.805588       1 shared_informer.go:262] Caches are synced for TTL
	I1128 19:46:03.807494       1 shared_informer.go:262] Caches are synced for TTL after finished
	I1128 19:46:03.810908       1 shared_informer.go:262] Caches are synced for PV protection
	I1128 19:46:03.811087       1 shared_informer.go:262] Caches are synced for disruption
	I1128 19:46:03.813835       1 shared_informer.go:262] Caches are synced for taint
	I1128 19:46:03.814002       1 taint_manager.go:204] "Starting NoExecuteTaintManager"
	I1128 19:46:03.814147       1 taint_manager.go:209] "Sending events to api server"
	I1128 19:46:03.814924       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	W1128 19:46:03.815014       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-114428. Assuming now as a timestamp.
	I1128 19:46:03.815095       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I1128 19:46:03.815185       1 event.go:294] "Event occurred" object="pause-114428" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-114428 event: Registered Node pause-114428 in Controller"
	I1128 19:46:03.824085       1 shared_informer.go:262] Caches are synced for stateful set
	I1128 19:46:03.826573       1 shared_informer.go:262] Caches are synced for ephemeral
	I1128 19:46:03.827910       1 shared_informer.go:262] Caches are synced for crt configmap
	I1128 19:46:03.832526       1 shared_informer.go:262] Caches are synced for GC
	I1128 19:46:03.892263       1 shared_informer.go:262] Caches are synced for resource quota
	I1128 19:46:03.915317       1 shared_informer.go:262] Caches are synced for persistent volume
	I1128 19:46:03.915496       1 shared_informer.go:262] Caches are synced for resource quota
	I1128 19:46:03.921779       1 shared_informer.go:262] Caches are synced for HPA
	I1128 19:46:04.348738       1 shared_informer.go:262] Caches are synced for garbage collector
	I1128 19:46:04.378484       1 shared_informer.go:262] Caches are synced for garbage collector
	I1128 19:46:04.378549       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-controller-manager [677c0d9be193] <==
	* I1128 19:45:35.295899       1 serving.go:348] Generated self-signed cert in-memory
	I1128 19:45:35.739194       1 controllermanager.go:178] Version: v1.25.3
	I1128 19:45:35.739272       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:35.740120       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I1128 19:45:35.740242       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I1128 19:45:35.740397       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1128 19:45:35.740906       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1128 19:45:37.085997       1 shared_informer.go:255] Waiting for caches to sync for tokens
	F1128 19:45:37.110344       1 client_builder_dynamic.go:138] Get "https://192.168.64.71:8443/api/v1/namespaces/kube-system/serviceaccounts/disruption-controller": dial tcp 192.168.64.71:8443: connect: connection refused
	
	* 
	* ==> kube-proxy [20905b729742] <==
	* I1128 19:45:52.278223       1 node.go:163] Successfully retrieved node IP: 192.168.64.71
	I1128 19:45:52.278294       1 server_others.go:138] "Detected node IP" address="192.168.64.71"
	I1128 19:45:52.278309       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I1128 19:45:52.300428       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I1128 19:45:52.300467       1 server_others.go:206] "Using iptables Proxier"
	I1128 19:45:52.300510       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I1128 19:45:52.300719       1 server.go:661] "Version info" version="v1.25.3"
	I1128 19:45:52.300746       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:52.301173       1 config.go:317] "Starting service config controller"
	I1128 19:45:52.301203       1 shared_informer.go:255] Waiting for caches to sync for service config
	I1128 19:45:52.301218       1 config.go:226] "Starting endpoint slice config controller"
	I1128 19:45:52.301222       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I1128 19:45:52.301572       1 config.go:444] "Starting node config controller"
	I1128 19:45:52.301599       1 shared_informer.go:255] Waiting for caches to sync for node config
	I1128 19:45:52.401733       1 shared_informer.go:262] Caches are synced for node config
	I1128 19:45:52.401817       1 shared_informer.go:262] Caches are synced for service config
	I1128 19:45:52.401834       1 shared_informer.go:262] Caches are synced for endpoint slice config
	
	* 
	* ==> kube-proxy [d3c7e3e4259d] <==
	* I1128 19:45:37.083278       1 node.go:163] Successfully retrieved node IP: 192.168.64.71
	I1128 19:45:37.083358       1 server_others.go:138] "Detected node IP" address="192.168.64.71"
	I1128 19:45:37.083374       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I1128 19:45:37.244908       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I1128 19:45:37.245016       1 server_others.go:206] "Using iptables Proxier"
	I1128 19:45:37.245128       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I1128 19:45:37.245414       1 server.go:661] "Version info" version="v1.25.3"
	I1128 19:45:37.245622       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:37.246202       1 config.go:317] "Starting service config controller"
	I1128 19:45:37.246265       1 shared_informer.go:255] Waiting for caches to sync for service config
	I1128 19:45:37.246308       1 config.go:226] "Starting endpoint slice config controller"
	I1128 19:45:37.246385       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I1128 19:45:37.246970       1 config.go:444] "Starting node config controller"
	I1128 19:45:37.247017       1 shared_informer.go:255] Waiting for caches to sync for node config
	E1128 19:45:37.247437       1 event_broadcaster.go:262] Unable to write event: 'Post "https://control-plane.minikube.internal:8443/apis/events.k8s.io/v1/namespaces/default/events": dial tcp 192.168.64.71:8443: connect: connection refused' (may retry after sleeping)
	W1128 19:45:37.247538       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://control-plane.minikube.internal:8443/api/v1/nodes?fieldSelector=metadata.name%!D(MISSING)pause-114428&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.247603       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://control-plane.minikube.internal:8443/api/v1/nodes?fieldSelector=metadata.name%!D(MISSING)pause-114428&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.247675       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.EndpointSlice: Get "https://control-plane.minikube.internal:8443/apis/discovery.k8s.io/v1/endpointslices?labelSelector=%!s(MISSING)ervice.kubernetes.io%!F(MISSING)headless%!C(MISSING)%!s(MISSING)ervice.kubernetes.io%!F(MISSING)service-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.247836       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://control-plane.minikube.internal:8443/apis/discovery.k8s.io/v1/endpointslices?labelSelector=%!s(MISSING)ervice.kubernetes.io%!F(MISSING)headless%!C(MISSING)%!s(MISSING)ervice.kubernetes.io%!F(MISSING)service-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.247942       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: Get "https://control-plane.minikube.internal:8443/api/v1/services?labelSelector=%!s(MISSING)ervice.kubernetes.io%!F(MISSING)headless%!C(MISSING)%!s(MISSING)ervice.kubernetes.io%!F(MISSING)service-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.248046       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://control-plane.minikube.internal:8443/api/v1/services?labelSelector=%!s(MISSING)ervice.kubernetes.io%!F(MISSING)headless%!C(MISSING)%!s(MISSING)ervice.kubernetes.io%!F(MISSING)service-proxy-name&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [7cf378f4872e] <==
	* W1128 19:45:37.579193       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.71:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579209       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.71:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579255       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.71:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579273       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.71:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579313       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579369       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579385       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579431       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: Get "https://192.168.64.71:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579446       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.168.64.71:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579491       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579509       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.71:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579561       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.71:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579577       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.71:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579626       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.71:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579642       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.71:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579689       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.71:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579706       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.71:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	W1128 19:45:37.579782       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.71:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	E1128 19:45:37.579821       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.71:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.71:8443: connect: connection refused
	I1128 19:45:40.192257       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I1128 19:45:40.192312       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E1128 19:45:40.192372       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I1128 19:45:40.192382       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E1128 19:45:40.192653       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kube-scheduler [f24fa2c61249] <==
	* I1128 19:45:48.943389       1 serving.go:348] Generated self-signed cert in-memory
	W1128 19:45:51.431124       1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W1128 19:45:51.431162       1 authentication.go:346] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W1128 19:45:51.431173       1 authentication.go:347] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1128 19:45:51.431178       1 authentication.go:348] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1128 19:45:51.489608       1 server.go:148] "Starting Kubernetes Scheduler" version="v1.25.3"
	I1128 19:45:51.489679       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1128 19:45:51.490452       1 secure_serving.go:210] Serving securely on 127.0.0.1:10259
	I1128 19:45:51.490664       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I1128 19:45:51.490685       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1128 19:45:51.492612       1 shared_informer.go:255] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I1128 19:45:51.592870       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Mon 2022-11-28 19:44:41 UTC, ends at Mon 2022-11-28 19:46:16 UTC. --
	Nov 28 19:45:50 pause-114428 kubelet[5160]: E1128 19:45:50.888347    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:50 pause-114428 kubelet[5160]: E1128 19:45:50.989327    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.090157    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.191131    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.291436    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: E1128 19:45:51.392560    5160 kubelet.go:2448] "Error getting node" err="node \"pause-114428\" not found"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.452540    5160 apiserver.go:52] "Watching apiserver"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.457340    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.457522    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.493456    5160 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.494470    5160 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.572696    5160 kubelet_node_status.go:108] "Node was previously registered" node="pause-114428"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.572896    5160 kubelet_node_status.go:73] "Successfully registered node" node="pause-114428"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587359    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/5f48a047-d53f-4630-beec-4846d0327f1f-kube-proxy\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587403    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5f48a047-d53f-4630-beec-4846d0327f1f-xtables-lock\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587423    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m896r\" (UniqueName: \"kubernetes.io/projected/5f48a047-d53f-4630-beec-4846d0327f1f-kube-api-access-m896r\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587438    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f48a047-d53f-4630-beec-4846d0327f1f-lib-modules\") pod \"kube-proxy-gm2kh\" (UID: \"5f48a047-d53f-4630-beec-4846d0327f1f\") " pod="kube-system/kube-proxy-gm2kh"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587453    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21d18ebd-7324-4815-857d-aa2cea270e10-config-volume\") pod \"coredns-565d847f94-hstbw\" (UID: \"21d18ebd-7324-4815-857d-aa2cea270e10\") " pod="kube-system/coredns-565d847f94-hstbw"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587467    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mc7k\" (UniqueName: \"kubernetes.io/projected/21d18ebd-7324-4815-857d-aa2cea270e10-kube-api-access-9mc7k\") pod \"coredns-565d847f94-hstbw\" (UID: \"21d18ebd-7324-4815-857d-aa2cea270e10\") " pod="kube-system/coredns-565d847f94-hstbw"
	Nov 28 19:45:51 pause-114428 kubelet[5160]: I1128 19:45:51.587475    5160 reconciler.go:169] "Reconciler: start to sync state"
	Nov 28 19:45:55 pause-114428 kubelet[5160]: I1128 19:45:55.048441    5160 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.448454    5160 topology_manager.go:205] "Topology Admit Handler"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.533903    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xg5f\" (UniqueName: \"kubernetes.io/projected/f3bb3ce4-fe67-4d5b-9a95-048c18b13469-kube-api-access-7xg5f\") pod \"storage-provisioner\" (UID: \"f3bb3ce4-fe67-4d5b-9a95-048c18b13469\") " pod="kube-system/storage-provisioner"
	Nov 28 19:46:07 pause-114428 kubelet[5160]: I1128 19:46:07.534114    5160 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/f3bb3ce4-fe67-4d5b-9a95-048c18b13469-tmp\") pod \"storage-provisioner\" (UID: \"f3bb3ce4-fe67-4d5b-9a95-048c18b13469\") " pod="kube-system/storage-provisioner"
	Nov 28 19:46:08 pause-114428 kubelet[5160]: I1128 19:46:08.130797    5160 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="abce6d5dca595edd52e4a8355a41885daa2203a0494b505eef197f4b59c6878b"
	
	* 
	* ==> storage-provisioner [ea817622d730] <==
	* I1128 19:46:08.218499       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I1128 19:46:08.227072       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I1128 19:46:08.227155       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I1128 19:46:08.231884       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I1128 19:46:08.232532       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7!
	I1128 19:46:08.235299       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"f4dc94b1-53db-49f8-ac60-7ccaf0a5fc0a", APIVersion:"v1", ResourceVersion:"473", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7 became leader
	I1128 19:46:08.333291       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-114428_f69256bb-0bd9-410a-9c26-d14dac5e0be7!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-114428 -n pause-114428
helpers_test.go:261: (dbg) Run:  kubectl --context pause-114428 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-114428 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-114428 describe pod : exit status 1 (38.06555ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context pause-114428 describe pod : exit status 1
--- FAIL: TestPause/serial/SecondStartNoReconfiguration (47.90s)
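For reference, the failure above is a plain substring assertion: pause_test.go:100 checks that the second `minikube start` run's output contains the message "The running cluster does not require reconfiguration", and it did not. A trivial sketch of that check (the function name is illustrative, not from the test suite):

```python
EXPECTED = "The running cluster does not require reconfiguration"

def second_start_ok(stdout: str) -> bool:
    """Mirror of the pause_test.go:100 check: the second `minikube start`
    must report that the running cluster needed no reconfiguration."""
    return EXPECTED in stdout

if __name__ == "__main__":
    # Abbreviated, illustrative samples of second-start output.
    print(second_start_ok('* Updating the running hyperkit "pause-114428" VM ...'))  # False
    print(second_start_ok("* " + EXPECTED))                                          # True
```

The captured stdout above shows `* Updating the running hyperkit "pause-114428" VM ...` instead of the expected message, so minikube reconfigured the cluster on the second start.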

TestNetworkPlugins/group/kubenet/HairPin (53.96s)

=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:56:06.868796   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:06.874897   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:06.885013   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:06.907119   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:06.948793   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:07.029231   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:07.190061   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:07.510630   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:08.152346   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:56:09.433339   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.102573812s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:56:11.993793   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.103896398s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E1128 11:56:17.115198   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.106507588s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:56:27.356338   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.097833529s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.111459606s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E1128 11:56:40.938470   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:56:44.925068   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:56:46.464319   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:56:47.861840   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.117032533s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:56:54.988252   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.107323265s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:243: failed to connect via pod host: exit status 1
--- FAIL: TestNetworkPlugins/group/kubenet/HairPin (53.96s)
E1128 12:13:20.700222   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 12:13:22.749512   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:13:24.981329   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 12:13:40.077288   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:13:43.697684   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 12:13:48.113865   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:13:50.440545   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:13:54.234636   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 12:13:57.089702   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 12:14:25.188584   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 12:14:26.578789   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:14:58.680427   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:15:03.126578   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:15:21.918883   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
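The failing assertion in this test (net_test.go:238) is a hairpin connectivity probe: `nc -w 5 -z netcat 8080`, run inside the `netcat` deployment, must reach the pod's own Service. `nc -z` does no I/O; it succeeds as soon as the TCP handshake completes, which is why the failure shows up only as `exit status 1`. A minimal local sketch of that zero-I/O connect check (the helper name and localhost listener are illustrative, not part of the test suite):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Zero-I/O TCP connect probe, analogous to `nc -w 5 -z host port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # A localhost listener stands in for the in-cluster netcat Service.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    print(can_connect("127.0.0.1", port))  # True: port is listening
    srv.close()
    print(can_connect("127.0.0.1", port))  # False: connection refused
```

In the kubenet case the probe fails because hairpin traffic (pod → Service → same pod) is not being turned around, so the handshake never completes and `nc` exits non-zero.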


Test pass (283/301)

Order passed test Duration
3 TestDownloadOnly/v1.16.0/json-events 9.71
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.29
10 TestDownloadOnly/v1.25.3/json-events 6.68
11 TestDownloadOnly/v1.25.3/preload-exists 0
14 TestDownloadOnly/v1.25.3/kubectl 0
15 TestDownloadOnly/v1.25.3/LogsDuration 0.33
16 TestDownloadOnly/DeleteAll 0.41
17 TestDownloadOnly/DeleteAlwaysSucceeds 0.39
19 TestBinaryMirror 0.97
20 TestOffline 61.57
22 TestAddons/Setup 223.84
24 TestAddons/parallel/Registry 22.42
25 TestAddons/parallel/Ingress 20.71
26 TestAddons/parallel/MetricsServer 5.62
27 TestAddons/parallel/HelmTiller 11.78
29 TestAddons/parallel/CSI 47.85
30 TestAddons/parallel/Headlamp 12.93
31 TestAddons/parallel/CloudSpanner 5.34
33 TestAddons/serial/GCPAuth 18.39
34 TestAddons/StoppedEnableDisable 2.58
35 TestCertOptions 45.47
36 TestCertExpiration 253.23
37 TestDockerFlags 54.71
38 TestForceSystemdFlag 44.93
39 TestForceSystemdEnv 43.28
41 TestHyperKitDriverInstallOrUpdate 8.83
44 TestErrorSpam/setup 41.44
45 TestErrorSpam/start 1.3
46 TestErrorSpam/status 0.5
47 TestErrorSpam/pause 1.31
48 TestErrorSpam/unpause 1.31
49 TestErrorSpam/stop 8.68
52 TestFunctional/serial/CopySyncFile 0
53 TestFunctional/serial/StartWithProxy 63.38
54 TestFunctional/serial/AuditLog 0
55 TestFunctional/serial/SoftStart 40.52
56 TestFunctional/serial/KubeContext 0.03
57 TestFunctional/serial/KubectlGetPods 0.07
60 TestFunctional/serial/CacheCmd/cache/add_remote 14.75
61 TestFunctional/serial/CacheCmd/cache/add_local 1.49
62 TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 0.08
63 TestFunctional/serial/CacheCmd/cache/list 0.08
64 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.17
65 TestFunctional/serial/CacheCmd/cache/cache_reload 3.07
66 TestFunctional/serial/CacheCmd/cache/delete 0.16
67 TestFunctional/serial/MinikubeKubectlCmd 0.49
68 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.65
69 TestFunctional/serial/ExtraConfig 42.66
70 TestFunctional/serial/ComponentHealth 0.05
71 TestFunctional/serial/LogsCmd 2.52
72 TestFunctional/serial/LogsFileCmd 2.66
74 TestFunctional/parallel/ConfigCmd 0.51
75 TestFunctional/parallel/DashboardCmd 13.09
76 TestFunctional/parallel/DryRun 1.03
77 TestFunctional/parallel/InternationalLanguage 0.46
78 TestFunctional/parallel/StatusCmd 0.52
81 TestFunctional/parallel/ServiceCmd 9.23
82 TestFunctional/parallel/ServiceCmdConnect 13.35
83 TestFunctional/parallel/AddonsCmd 0.28
84 TestFunctional/parallel/PersistentVolumeClaim 28.64
86 TestFunctional/parallel/SSHCmd 0.34
87 TestFunctional/parallel/CpCmd 0.63
88 TestFunctional/parallel/MySQL 22.29
89 TestFunctional/parallel/FileSync 0.16
90 TestFunctional/parallel/CertSync 1.05
94 TestFunctional/parallel/NodeLabels 0.06
96 TestFunctional/parallel/NonActiveRuntimeDisabled 0.2
98 TestFunctional/parallel/License 0.46
100 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
102 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 11.15
103 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
104 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
105 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.02
106 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.02
107 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
108 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
109 TestFunctional/parallel/ProfileCmd/profile_not_create 0.32
110 TestFunctional/parallel/ProfileCmd/profile_list 0.29
111 TestFunctional/parallel/ProfileCmd/profile_json_output 0.29
112 TestFunctional/parallel/MountCmd/any-port 12.29
113 TestFunctional/parallel/MountCmd/specific-port 1.61
114 TestFunctional/parallel/Version/short 0.14
115 TestFunctional/parallel/Version/components 0.45
116 TestFunctional/parallel/ImageCommands/ImageListShort 0.18
117 TestFunctional/parallel/ImageCommands/ImageListTable 0.16
118 TestFunctional/parallel/ImageCommands/ImageListJson 0.16
119 TestFunctional/parallel/ImageCommands/ImageListYaml 0.23
120 TestFunctional/parallel/ImageCommands/ImageBuild 6.7
121 TestFunctional/parallel/ImageCommands/Setup 5.53
122 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.23
123 TestFunctional/parallel/DockerEnv/bash 0.74
124 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
125 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.2
126 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.21
127 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.07
128 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 8.73
129 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.92
130 TestFunctional/parallel/ImageCommands/ImageRemove 0.36
131 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.11
132 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 2
133 TestFunctional/delete_addon-resizer_images 0.15
134 TestFunctional/delete_my-image_image 0.06
135 TestFunctional/delete_minikube_cached_images 0.06
138 TestIngressAddonLegacy/StartLegacyK8sCluster 75.01
140 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 14.74
141 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.49
142 TestIngressAddonLegacy/serial/ValidateIngressAddons 38.1
145 TestJSONOutput/start/Command 59.22
146 TestJSONOutput/start/Audit 0
148 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
149 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
151 TestJSONOutput/pause/Command 0.48
152 TestJSONOutput/pause/Audit 0
154 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
155 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
157 TestJSONOutput/unpause/Command 0.48
158 TestJSONOutput/unpause/Audit 0
160 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
161 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
163 TestJSONOutput/stop/Command 8.16
164 TestJSONOutput/stop/Audit 0
166 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
167 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
168 TestErrorJSONOutput 0.74
172 TestMainNoArgs 0.08
173 TestMinikubeProfile 102.2
176 TestMountStart/serial/StartWithMountFirst 17.37
177 TestMountStart/serial/VerifyMountFirst 0.31
178 TestMountStart/serial/StartWithMountSecond 17.17
179 TestMountStart/serial/VerifyMountSecond 0.29
180 TestMountStart/serial/DeleteFirst 2.39
181 TestMountStart/serial/VerifyMountPostDelete 0.28
182 TestMountStart/serial/Stop 2.24
183 TestMountStart/serial/RestartStopped 16.63
184 TestMountStart/serial/VerifyMountPostStop 0.31
187 TestMultiNode/serial/FreshStart2Nodes 145.42
188 TestMultiNode/serial/DeployApp2Nodes 9.6
189 TestMultiNode/serial/PingHostFrom2Pods 0.86
190 TestMultiNode/serial/AddNode 44.17
191 TestMultiNode/serial/ProfileList 0.21
192 TestMultiNode/serial/CopyFile 5.36
193 TestMultiNode/serial/StopNode 2.67
194 TestMultiNode/serial/StartAfterStop 30.67
195 TestMultiNode/serial/RestartKeepsNodes 861.15
196 TestMultiNode/serial/DeleteNode 4.96
197 TestMultiNode/serial/StopMultiNode 4.47
198 TestMultiNode/serial/RestartMultiNode 555.37
199 TestMultiNode/serial/ValidateNameConflict 48.19
203 TestPreload 147.54
205 TestScheduledStopUnix 111.25
206 TestSkaffold 79.1
209 TestRunningBinaryUpgrade 171.51
211 TestKubernetesUpgrade 167.34
224 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.73
225 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.7
233 TestStoppedBinaryUpgrade/Setup 0.7
234 TestStoppedBinaryUpgrade/Upgrade 184.4
235 TestStoppedBinaryUpgrade/MinikubeLogs 3.08
237 TestNoKubernetes/serial/StartNoK8sWithVersion 0.48
238 TestNoKubernetes/serial/StartWithK8s 41.95
240 TestPause/serial/Start 61
241 TestNoKubernetes/serial/StartWithStopK8s 16.59
242 TestNoKubernetes/serial/Start 14.86
244 TestNoKubernetes/serial/VerifyK8sNotRunning 0.13
245 TestNoKubernetes/serial/ProfileList 2.75
246 TestNoKubernetes/serial/Stop 2.21
247 TestNoKubernetes/serial/StartNoArgs 14.25
248 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.12
249 TestNetworkPlugins/group/auto/Start 58.28
250 TestNetworkPlugins/group/kindnet/Start 63.29
251 TestNetworkPlugins/group/auto/KubeletFlags 0.16
252 TestNetworkPlugins/group/auto/NetCatPod 14.18
253 TestNetworkPlugins/group/auto/DNS 0.12
254 TestNetworkPlugins/group/auto/Localhost 0.1
255 TestNetworkPlugins/group/auto/HairPin 5.11
256 TestNetworkPlugins/group/cilium/Start 97.39
257 TestNetworkPlugins/group/kindnet/ControllerPod 5.01
258 TestNetworkPlugins/group/kindnet/KubeletFlags 0.15
259 TestNetworkPlugins/group/kindnet/NetCatPod 14.23
260 TestNetworkPlugins/group/kindnet/DNS 0.15
261 TestNetworkPlugins/group/kindnet/Localhost 0.11
262 TestNetworkPlugins/group/kindnet/HairPin 0.11
263 TestNetworkPlugins/group/calico/Start 311.05
264 TestNetworkPlugins/group/cilium/ControllerPod 5.01
265 TestNetworkPlugins/group/cilium/KubeletFlags 0.15
266 TestNetworkPlugins/group/cilium/NetCatPod 15.65
267 TestNetworkPlugins/group/cilium/DNS 0.14
268 TestNetworkPlugins/group/cilium/Localhost 0.1
269 TestNetworkPlugins/group/cilium/HairPin 0.1
270 TestNetworkPlugins/group/custom-flannel/Start 103.1
271 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.16
272 TestNetworkPlugins/group/custom-flannel/NetCatPod 15.19
273 TestNetworkPlugins/group/custom-flannel/DNS 0.11
274 TestNetworkPlugins/group/custom-flannel/Localhost 0.11
275 TestNetworkPlugins/group/custom-flannel/HairPin 0.1
276 TestNetworkPlugins/group/false/Start 54.65
277 TestNetworkPlugins/group/false/KubeletFlags 0.14
278 TestNetworkPlugins/group/false/NetCatPod 14.19
279 TestNetworkPlugins/group/false/DNS 0.12
280 TestNetworkPlugins/group/false/Localhost 0.1
281 TestNetworkPlugins/group/false/HairPin 5.11
282 TestNetworkPlugins/group/enable-default-cni/Start 54.48
283 TestNetworkPlugins/group/calico/ControllerPod 5.01
284 TestNetworkPlugins/group/calico/KubeletFlags 0.15
285 TestNetworkPlugins/group/calico/NetCatPod 15.31
286 TestNetworkPlugins/group/calico/DNS 0.15
287 TestNetworkPlugins/group/calico/Localhost 0.1
288 TestNetworkPlugins/group/calico/HairPin 0.11
289 TestNetworkPlugins/group/flannel/Start 58.38
290 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.15
291 TestNetworkPlugins/group/enable-default-cni/NetCatPod 15.19
292 TestNetworkPlugins/group/enable-default-cni/DNS 0.11
293 TestNetworkPlugins/group/enable-default-cni/Localhost 0.1
294 TestNetworkPlugins/group/enable-default-cni/HairPin 0.1
295 TestNetworkPlugins/group/bridge/Start 59.47
296 TestNetworkPlugins/group/flannel/ControllerPod 8.01
297 TestNetworkPlugins/group/flannel/KubeletFlags 0.16
298 TestNetworkPlugins/group/flannel/NetCatPod 15.22
299 TestNetworkPlugins/group/flannel/DNS 0.11
300 TestNetworkPlugins/group/flannel/Localhost 0.1
301 TestNetworkPlugins/group/flannel/HairPin 0.1
302 TestNetworkPlugins/group/kubenet/Start 56.75
303 TestNetworkPlugins/group/bridge/KubeletFlags 0.15
304 TestNetworkPlugins/group/bridge/NetCatPod 14.21
305 TestNetworkPlugins/group/bridge/DNS 0.12
306 TestNetworkPlugins/group/bridge/Localhost 0.11
307 TestNetworkPlugins/group/bridge/HairPin 0.1
309 TestStartStop/group/old-k8s-version/serial/FirstStart 345.77
310 TestNetworkPlugins/group/kubenet/KubeletFlags 0.15
311 TestNetworkPlugins/group/kubenet/NetCatPod 14.19
312 TestNetworkPlugins/group/kubenet/DNS 0.11
313 TestNetworkPlugins/group/kubenet/Localhost 0.1
316 TestStartStop/group/no-preload/serial/FirstStart 77.6
317 TestStartStop/group/no-preload/serial/DeployApp 13.27
318 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 0.68
319 TestStartStop/group/no-preload/serial/Stop 8.26
320 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.3
321 TestStartStop/group/no-preload/serial/SecondStart 315.91
322 TestStartStop/group/old-k8s-version/serial/DeployApp 12.29
323 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.64
324 TestStartStop/group/old-k8s-version/serial/Stop 2.24
325 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.3
326 TestStartStop/group/old-k8s-version/serial/SecondStart 453.79
327 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 19.01
328 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.06
329 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.17
330 TestStartStop/group/no-preload/serial/Pause 1.89
332 TestStartStop/group/embed-certs/serial/FirstStart 54.27
333 TestStartStop/group/embed-certs/serial/DeployApp 13.26
334 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.66
335 TestStartStop/group/embed-certs/serial/Stop 8.23
336 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.39
337 TestStartStop/group/embed-certs/serial/SecondStart 315.59
338 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 5.01
339 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
340 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.16
341 TestStartStop/group/old-k8s-version/serial/Pause 1.77
343 TestStartStop/group/default-k8s-diff-port/serial/FirstStart 56.52
344 TestStartStop/group/default-k8s-diff-port/serial/DeployApp 13.26
345 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive 0.64
346 TestStartStop/group/default-k8s-diff-port/serial/Stop 3.22
347 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.3
348 TestStartStop/group/default-k8s-diff-port/serial/SecondStart 311.99
349 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 9.01
350 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.06
351 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.17
352 TestStartStop/group/embed-certs/serial/Pause 1.91
354 TestStartStop/group/newest-cni/serial/FirstStart 52.95
355 TestStartStop/group/newest-cni/serial/DeployApp 0
356 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 1.21
357 TestStartStop/group/newest-cni/serial/Stop 8.25
358 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.3
359 TestStartStop/group/newest-cni/serial/SecondStart 31.7
360 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
361 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
362 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.17
363 TestStartStop/group/newest-cni/serial/Pause 1.81
364 TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop 12.01
365 TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop 5.06
366 TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages 0.17
367 TestStartStop/group/default-k8s-diff-port/serial/Pause 1.84
TestDownloadOnly/v1.16.0/json-events (9.71s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-104417 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-104417 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (9.710648501s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (9.71s)

TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-104417
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-104417: exit status 85 (288.531313ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-104417 | jenkins | v1.28.0 | 28 Nov 22 10:44 PST |          |
	|         | -p download-only-104417        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/11/28 10:44:17
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19.3 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1128 10:44:17.278430   15825 out.go:296] Setting OutFile to fd 1 ...
	I1128 10:44:17.278614   15825 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:44:17.278619   15825 out.go:309] Setting ErrFile to fd 2...
	I1128 10:44:17.278623   15825 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:44:17.278743   15825 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	W1128 10:44:17.278849   15825 root.go:311] Error reading config file at /Users/jenkins/minikube-integration/15411-14646/.minikube/config/config.json: open /Users/jenkins/minikube-integration/15411-14646/.minikube/config/config.json: no such file or directory
	I1128 10:44:17.279616   15825 out.go:303] Setting JSON to true
	I1128 10:44:17.300634   15825 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6232,"bootTime":1669654825,"procs":393,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 10:44:17.300772   15825 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 10:44:17.322942   15825 out.go:97] [download-only-104417] minikube v1.28.0 on Darwin 13.0.1
	I1128 10:44:17.323143   15825 notify.go:220] Checking for updates...
	W1128 10:44:17.323187   15825 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball: no such file or directory
	I1128 10:44:17.344743   15825 out.go:169] MINIKUBE_LOCATION=15411
	I1128 10:44:17.365727   15825 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 10:44:17.388015   15825 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 10:44:17.409945   15825 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 10:44:17.431899   15825 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	W1128 10:44:17.474835   15825 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1128 10:44:17.475253   15825 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 10:44:17.503722   15825 out.go:97] Using the hyperkit driver based on user configuration
	I1128 10:44:17.503845   15825 start.go:293] selected driver: hyperkit
	I1128 10:44:17.503865   15825 start.go:837] validating driver "hyperkit" against <nil>
	I1128 10:44:17.503983   15825 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 10:44:17.504231   15825 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/15411-14646/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1128 10:44:17.641276   15825 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.28.0
	I1128 10:44:17.644621   15825 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:44:17.644638   15825 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1128 10:44:17.644674   15825 start_flags.go:303] no existing cluster config was found, will generate one from the flags 
	I1128 10:44:17.647607   15825 start_flags.go:384] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I1128 10:44:17.647712   15825 start_flags.go:892] Wait components to verify : map[apiserver:true system_pods:true]
	I1128 10:44:17.647738   15825 cni.go:95] Creating CNI manager for ""
	I1128 10:44:17.647747   15825 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 10:44:17.647757   15825 start_flags.go:317] config:
	{Name:download-only-104417 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-104417 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 10:44:17.647978   15825 iso.go:125] acquiring lock: {Name:mkf8786ebc65c7c4a918cffd312ffffda2a4bd0b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 10:44:17.669247   15825 out.go:97] Downloading VM boot image ...
	I1128 10:44:17.669413   15825 download.go:101] Downloading: https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/iso/amd64/minikube-v1.28.0-1668700269-15235-amd64.iso
	I1128 10:44:21.217726   15825 out.go:97] Starting control plane node download-only-104417 in cluster download-only-104417
	I1128 10:44:21.217826   15825 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I1128 10:44:21.273855   15825 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I1128 10:44:21.273888   15825 cache.go:57] Caching tarball of preloaded images
	I1128 10:44:21.274232   15825 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I1128 10:44:21.295651   15825 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I1128 10:44:21.295692   15825 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I1128 10:44:21.378817   15825 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-104417"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

TestDownloadOnly/v1.25.3/json-events (6.68s)

=== RUN   TestDownloadOnly/v1.25.3/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-104417 --force --alsologtostderr --kubernetes-version=v1.25.3 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-104417 --force --alsologtostderr --kubernetes-version=v1.25.3 --container-runtime=docker --driver=hyperkit : (6.682707102s)
--- PASS: TestDownloadOnly/v1.25.3/json-events (6.68s)

TestDownloadOnly/v1.25.3/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.25.3/preload-exists
--- PASS: TestDownloadOnly/v1.25.3/preload-exists (0.00s)

TestDownloadOnly/v1.25.3/kubectl (0s)

=== RUN   TestDownloadOnly/v1.25.3/kubectl
--- PASS: TestDownloadOnly/v1.25.3/kubectl (0.00s)

TestDownloadOnly/v1.25.3/LogsDuration (0.33s)

=== RUN   TestDownloadOnly/v1.25.3/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-104417
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-104417: exit status 85 (332.01063ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-104417 | jenkins | v1.28.0 | 28 Nov 22 10:44 PST |          |
	|         | -p download-only-104417        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	| start   | -o=json --download-only        | download-only-104417 | jenkins | v1.28.0 | 28 Nov 22 10:44 PST |          |
	|         | -p download-only-104417        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.25.3   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/11/28 10:44:27
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19.3 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1128 10:44:27.279545   15847 out.go:296] Setting OutFile to fd 1 ...
	I1128 10:44:27.279714   15847 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:44:27.279720   15847 out.go:309] Setting ErrFile to fd 2...
	I1128 10:44:27.279724   15847 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:44:27.279825   15847 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	W1128 10:44:27.279933   15847 root.go:311] Error reading config file at /Users/jenkins/minikube-integration/15411-14646/.minikube/config/config.json: open /Users/jenkins/minikube-integration/15411-14646/.minikube/config/config.json: no such file or directory
	I1128 10:44:27.280321   15847 out.go:303] Setting JSON to true
	I1128 10:44:27.300496   15847 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6242,"bootTime":1669654825,"procs":394,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 10:44:27.300581   15847 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 10:44:27.322420   15847 out.go:97] [download-only-104417] minikube v1.28.0 on Darwin 13.0.1
	I1128 10:44:27.322504   15847 notify.go:220] Checking for updates...
	I1128 10:44:27.343498   15847 out.go:169] MINIKUBE_LOCATION=15411
	I1128 10:44:27.364683   15847 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 10:44:27.385723   15847 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 10:44:27.407827   15847 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 10:44:27.429876   15847 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	W1128 10:44:27.472659   15847 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1128 10:44:27.473362   15847 config.go:180] Loaded profile config "download-only-104417": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	W1128 10:44:27.473455   15847 start.go:745] api.Load failed for download-only-104417: filestore "download-only-104417": Docker machine "download-only-104417" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1128 10:44:27.473536   15847 driver.go:365] Setting default libvirt URI to qemu:///system
	W1128 10:44:27.473576   15847 start.go:745] api.Load failed for download-only-104417: filestore "download-only-104417": Docker machine "download-only-104417" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I1128 10:44:27.502697   15847 out.go:97] Using the hyperkit driver based on existing profile
	I1128 10:44:27.502805   15847 start.go:293] selected driver: hyperkit
	I1128 10:44:27.502821   15847 start.go:837] validating driver "hyperkit" against &{Name:download-only-104417 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-104417 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 10:44:27.503096   15847 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 10:44:27.503311   15847 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/15411-14646/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I1128 10:44:27.511948   15847 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.28.0
	I1128 10:44:27.515272   15847 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:44:27.515291   15847 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I1128 10:44:27.517449   15847 cni.go:95] Creating CNI manager for ""
	I1128 10:44:27.517465   15847 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I1128 10:44:27.517487   15847 start_flags.go:317] config:
	{Name:download-only-104417 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.3 ClusterName:download-only-104417 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 10:44:27.517611   15847 iso.go:125] acquiring lock: {Name:mkf8786ebc65c7c4a918cffd312ffffda2a4bd0b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1128 10:44:27.538444   15847 out.go:97] Starting control plane node download-only-104417 in cluster download-only-104417
	I1128 10:44:27.538571   15847 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 10:44:27.593939   15847 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.3/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4
	I1128 10:44:27.593980   15847 cache.go:57] Caching tarball of preloaded images
	I1128 10:44:27.594388   15847 preload.go:132] Checking if preload exists for k8s version v1.25.3 and runtime docker
	I1128 10:44:27.615906   15847 out.go:97] Downloading Kubernetes v1.25.3 preload ...
	I1128 10:44:27.616001   15847 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 ...
	I1128 10:44:27.692758   15847 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.3/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4?checksum=md5:624cb874287e7e3d793b79e4205a7f98 -> /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4
	I1128 10:44:32.172927   15847 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 ...
	I1128 10:44:32.173138   15847 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/15411-14646/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.3-docker-overlay2-amd64.tar.lz4 ...
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-104417"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.25.3/LogsDuration (0.33s)

TestDownloadOnly/DeleteAll (0.41s)

=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/DeleteAll (0.41s)

TestDownloadOnly/DeleteAlwaysSucceeds (0.39s)

=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:203: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-104417
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.39s)

TestBinaryMirror (0.97s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-104435 --alsologtostderr --binary-mirror http://127.0.0.1:54979 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-104435" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-104435
--- PASS: TestBinaryMirror (0.97s)

TestOffline (61.57s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-113537 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-113537 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (56.276730759s)
helpers_test.go:175: Cleaning up "offline-docker-113537" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-113537

=== CONT  TestOffline
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-113537: (5.297617124s)
--- PASS: TestOffline (61.57s)

TestAddons/Setup (223.84s)

=== RUN   TestAddons/Setup
addons_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-104436 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:76: (dbg) Done: out/minikube-darwin-amd64 start -p addons-104436 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (3m43.839344062s)
--- PASS: TestAddons/Setup (223.84s)

TestAddons/parallel/Registry (22.42s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:283: registry stabilized in 8.002267ms
addons_test.go:285: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...

=== CONT  TestAddons/parallel/Registry
helpers_test.go:342: "registry-frg9g" [92ff0512-01c1-4552-9e62-d44bfcbb3d55] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:285: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.0120291s
addons_test.go:288: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-proxy-hc7kt" [3ceff0b3-846e-480f-80b6-0f97bb832f87] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:288: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.009358436s
addons_test.go:293: (dbg) Run:  kubectl --context addons-104436 delete po -l run=registry-test --now
addons_test.go:298: (dbg) Run:  kubectl --context addons-104436 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"

=== CONT  TestAddons/parallel/Registry
addons_test.go:298: (dbg) Done: kubectl --context addons-104436 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (11.761216083s)
addons_test.go:312: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 ip
2022/11/28 10:48:42 [DEBUG] GET http://192.168.64.45:5000
addons_test.go:341: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (22.42s)

TestAddons/parallel/Ingress (20.71s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:165: (dbg) Run:  kubectl --context addons-104436 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:185: (dbg) Run:  kubectl --context addons-104436 replace --force -f testdata/nginx-ingress-v1.yaml

=== CONT  TestAddons/parallel/Ingress
addons_test.go:198: (dbg) Run:  kubectl --context addons-104436 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:203: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [4a1f2520-ba19-494d-bd6a-4e394a36768c] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])

=== CONT  TestAddons/parallel/Ingress
helpers_test.go:342: "nginx" [4a1f2520-ba19-494d-bd6a-4e394a36768c] Running

=== CONT  TestAddons/parallel/Ingress
addons_test.go:203: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 10.040993503s
addons_test.go:215: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:239: (dbg) Run:  kubectl --context addons-104436 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 ip
addons_test.go:250: (dbg) Run:  nslookup hello-john.test 192.168.64.45
addons_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable ingress-dns --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Ingress
addons_test.go:259: (dbg) Done: out/minikube-darwin-amd64 -p addons-104436 addons disable ingress-dns --alsologtostderr -v=1: (1.759564757s)
addons_test.go:264: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable ingress --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Ingress
addons_test.go:264: (dbg) Done: out/minikube-darwin-amd64 -p addons-104436 addons disable ingress --alsologtostderr -v=1: (7.412068208s)
--- PASS: TestAddons/parallel/Ingress (20.71s)

TestAddons/parallel/MetricsServer (5.62s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:360: metrics-server stabilized in 1.924108ms
addons_test.go:362: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:342: "metrics-server-56c6cfbdd9-swhvb" [749d2051-d7db-48c1-b2d0-aadb37009c9f] Running

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:362: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.005934344s
addons_test.go:368: (dbg) Run:  kubectl --context addons-104436 top pods -n kube-system
addons_test.go:385: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.62s)

TestAddons/parallel/HelmTiller (11.78s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:409: tiller-deploy stabilized in 3.439926ms
addons_test.go:411: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:342: "tiller-deploy-696b5bfbb7-r5z7r" [b4b7507f-4bb9-4e75-844d-0924d05d2953] Running

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:411: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.008975942s
addons_test.go:426: (dbg) Run:  kubectl --context addons-104436 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:426: (dbg) Done: kubectl --context addons-104436 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version: (6.417133416s)
addons_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (11.78s)

TestAddons/parallel/CSI (47.85s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:514: csi-hostpath-driver pods stabilized in 3.444032ms
addons_test.go:517: (dbg) Run:  kubectl --context addons-104436 create -f testdata/csi-hostpath-driver/pvc.yaml

=== CONT  TestAddons/parallel/CSI
addons_test.go:522: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-104436 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:392: (dbg) Run:  kubectl --context addons-104436 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:527: (dbg) Run:  kubectl --context addons-104436 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:532: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:342: "task-pv-pod" [09c81e45-de23-4a98-810f-448f26b3b64f] Pending
helpers_test.go:342: "task-pv-pod" [09c81e45-de23-4a98-810f-448f26b3b64f] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])

=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod" [09c81e45-de23-4a98-810f-448f26b3b64f] Running
addons_test.go:532: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 21.016748107s
addons_test.go:537: (dbg) Run:  kubectl --context addons-104436 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:542: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:417: (dbg) Run:  kubectl --context addons-104436 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:417: (dbg) Run:  kubectl --context addons-104436 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:547: (dbg) Run:  kubectl --context addons-104436 delete pod task-pv-pod
addons_test.go:553: (dbg) Run:  kubectl --context addons-104436 delete pvc hpvc
addons_test.go:559: (dbg) Run:  kubectl --context addons-104436 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:564: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-104436 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:569: (dbg) Run:  kubectl --context addons-104436 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:574: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:342: "task-pv-pod-restore" [b9cce976-71d3-4ce2-9a4a-fa71b4b1ed6e] Pending
helpers_test.go:342: "task-pv-pod-restore" [b9cce976-71d3-4ce2-9a4a-fa71b4b1ed6e] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:342: "task-pv-pod-restore" [b9cce976-71d3-4ce2-9a4a-fa71b4b1ed6e] Running
addons_test.go:574: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 15.008988144s
addons_test.go:579: (dbg) Run:  kubectl --context addons-104436 delete pod task-pv-pod-restore
addons_test.go:583: (dbg) Run:  kubectl --context addons-104436 delete pvc hpvc-restore
addons_test.go:587: (dbg) Run:  kubectl --context addons-104436 delete volumesnapshot new-snapshot-demo
addons_test.go:591: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:591: (dbg) Done: out/minikube-darwin-amd64 -p addons-104436 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.632450993s)
addons_test.go:595: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (47.85s)

TestAddons/parallel/Headlamp (12.93s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:738: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-104436 --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:743: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:342: "headlamp-5f4cf474d8-qswh7" [7b6b2240-70a6-450c-9ff6-13cdd3b84c91] Pending
helpers_test.go:342: "headlamp-5f4cf474d8-qswh7" [7b6b2240-70a6-450c-9ff6-13cdd3b84c91] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])

=== CONT  TestAddons/parallel/Headlamp
helpers_test.go:342: "headlamp-5f4cf474d8-qswh7" [7b6b2240-70a6-450c-9ff6-13cdd3b84c91] Running

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:743: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.043382335s
--- PASS: TestAddons/parallel/Headlamp (12.93s)

TestAddons/parallel/CloudSpanner (5.34s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:759: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...

=== CONT  TestAddons/parallel/CloudSpanner
helpers_test.go:342: "cloud-spanner-emulator-8549f94bf8-hpzqx" [2289a8f5-6b7e-43d9-8b43-116185a5a364] Running
addons_test.go:759: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.010404595s
addons_test.go:762: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-104436
--- PASS: TestAddons/parallel/CloudSpanner (5.34s)

TestAddons/serial/GCPAuth (18.39s)

=== RUN   TestAddons/serial/GCPAuth
addons_test.go:606: (dbg) Run:  kubectl --context addons-104436 create -f testdata/busybox.yaml
addons_test.go:613: (dbg) Run:  kubectl --context addons-104436 create sa gcp-auth-test
addons_test.go:619: (dbg) TestAddons/serial/GCPAuth: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [e69836f8-27d7-4925-b7a6-f16c1c90c564] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [e69836f8-27d7-4925-b7a6-f16c1c90c564] Running
addons_test.go:619: (dbg) TestAddons/serial/GCPAuth: integration-test=busybox healthy within 12.006828456s
addons_test.go:625: (dbg) Run:  kubectl --context addons-104436 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:637: (dbg) Run:  kubectl --context addons-104436 describe sa gcp-auth-test
addons_test.go:651: (dbg) Run:  kubectl --context addons-104436 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:675: (dbg) Run:  kubectl --context addons-104436 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
addons_test.go:688: (dbg) Run:  out/minikube-darwin-amd64 -p addons-104436 addons disable gcp-auth --alsologtostderr -v=1
addons_test.go:688: (dbg) Done: out/minikube-darwin-amd64 -p addons-104436 addons disable gcp-auth --alsologtostderr -v=1: (5.864009844s)
--- PASS: TestAddons/serial/GCPAuth (18.39s)

TestAddons/StoppedEnableDisable (2.58s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:135: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-104436
addons_test.go:135: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-104436: (2.208707692s)
addons_test.go:139: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-104436
addons_test.go:143: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-104436
--- PASS: TestAddons/StoppedEnableDisable (2.58s)

TestCertOptions (45.47s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-113734 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-113734 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (39.820010336s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-113734 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-113734 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-113734 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-113734" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-113734
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-113734: (5.285815576s)
--- PASS: TestCertOptions (45.47s)

TestCertExpiration (253.23s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-113723 --memory=2048 --cert-expiration=3m --driver=hyperkit 

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-113723 --memory=2048 --cert-expiration=3m --driver=hyperkit : (40.63329231s)

=== CONT  TestCertExpiration
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-113723 --memory=2048 --cert-expiration=8760h --driver=hyperkit 

=== CONT  TestCertExpiration
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-113723 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (27.320705466s)
helpers_test.go:175: Cleaning up "cert-expiration-113723" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-113723
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-113723: (5.271032705s)
--- PASS: TestCertExpiration (253.23s)

TestDockerFlags (54.71s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-113639 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
E1128 11:36:46.345030   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-113639 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (50.935768678s)
docker_test.go:50: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-113639 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-113639 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-113639" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-113639
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-113639: (3.459643209s)
--- PASS: TestDockerFlags (54.71s)

TestForceSystemdFlag (44.93s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-113638 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-113638 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (41.304644669s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-113638 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-113638" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-113638
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-113638: (3.446577207s)
--- PASS: TestForceSystemdFlag (44.93s)

TestForceSystemdEnv (43.28s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-113556 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 

=== CONT  TestForceSystemdEnv
docker_test.go:149: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-113556 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (39.667801509s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-113556 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-113556" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-113556

=== CONT  TestForceSystemdEnv
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-113556: (3.449439202s)
--- PASS: TestForceSystemdEnv (43.28s)

TestHyperKitDriverInstallOrUpdate (8.83s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (8.83s)

TestErrorSpam/setup (41.44s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-105005 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-105005 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 --driver=hyperkit : (41.435529106s)
--- PASS: TestErrorSpam/setup (41.44s)

TestErrorSpam/start (1.3s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 start --dry-run
--- PASS: TestErrorSpam/start (1.30s)

TestErrorSpam/status (0.5s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 status
--- PASS: TestErrorSpam/status (0.50s)

TestErrorSpam/pause (1.31s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 pause
--- PASS: TestErrorSpam/pause (1.31s)

TestErrorSpam/unpause (1.31s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 unpause
--- PASS: TestErrorSpam/unpause (1.31s)

TestErrorSpam/stop (8.68s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 stop: (8.244586149s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 stop
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-105005 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-105005 stop
--- PASS: TestErrorSpam/stop (8.68s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1782: local sync path: /Users/jenkins/minikube-integration/15411-14646/.minikube/files/etc/test/nested/copy/15823/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (63.38s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2161: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2161: (dbg) Done: out/minikube-darwin-amd64 start -p functional-105100 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (1m3.379720557s)
--- PASS: TestFunctional/serial/StartWithProxy (63.38s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (40.52s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:652: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --alsologtostderr -v=8
functional_test.go:652: (dbg) Done: out/minikube-darwin-amd64 start -p functional-105100 --alsologtostderr -v=8: (40.516504223s)
functional_test.go:656: soft start took 40.517022471s for "functional-105100" cluster.
--- PASS: TestFunctional/serial/SoftStart (40.52s)

TestFunctional/serial/KubeContext (0.03s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:674: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.03s)

TestFunctional/serial/KubectlGetPods (0.07s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:689: (dbg) Run:  kubectl --context functional-105100 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

TestFunctional/serial/CacheCmd/cache/add_remote (14.75s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1042: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:3.1
functional_test.go:1042: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:3.1: (4.927545452s)
functional_test.go:1042: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:3.3
functional_test.go:1042: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:3.3: (5.43514337s)
functional_test.go:1042: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:latest
functional_test.go:1042: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 cache add k8s.gcr.io/pause:latest: (4.388634707s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (14.75s)

TestFunctional/serial/CacheCmd/cache/add_local (1.49s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1070: (dbg) Run:  docker build -t minikube-local-cache-test:functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialCacheCmdcacheadd_local3044442015/001
functional_test.go:1082: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache add minikube-local-cache-test:functional-105100
functional_test.go:1087: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache delete minikube-local-cache-test:functional-105100
functional_test.go:1076: (dbg) Run:  docker rmi minikube-local-cache-test:functional-105100
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.49s)

TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3
functional_test.go:1095: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1103: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1117: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

TestFunctional/serial/CacheCmd/cache/cache_reload (3.07s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1140: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh sudo docker rmi k8s.gcr.io/pause:latest
functional_test.go:1146: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
functional_test.go:1146: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh sudo crictl inspecti k8s.gcr.io/pause:latest: exit status 1 (144.65022ms)

-- stdout --
	FATA[0000] no such image "k8s.gcr.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1151: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cache reload
functional_test.go:1151: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 cache reload: (2.583401131s)
functional_test.go:1156: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (3.07s)

TestFunctional/serial/CacheCmd/cache/delete (0.16s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1165: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.1
functional_test.go:1165: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.16s)

TestFunctional/serial/MinikubeKubectlCmd (0.49s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:709: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 kubectl -- --context functional-105100 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.49s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.65s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:734: (dbg) Run:  out/kubectl --context functional-105100 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.65s)

TestFunctional/serial/ExtraConfig (42.66s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:750: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1128 10:53:20.232289   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.239703   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.251961   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.272354   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.313327   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.393729   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.554472   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:20.874708   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:21.515764   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:22.798015   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:25.358201   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:30.480364   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:53:40.720996   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
functional_test.go:750: (dbg) Done: out/minikube-darwin-amd64 start -p functional-105100 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (42.658157654s)
functional_test.go:754: restart took 42.658315532s for "functional-105100" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (42.66s)

TestFunctional/serial/ComponentHealth (0.05s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:803: (dbg) Run:  kubectl --context functional-105100 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:818: etcd phase: Running
functional_test.go:828: etcd status: Ready
functional_test.go:818: kube-apiserver phase: Running
functional_test.go:828: kube-apiserver status: Ready
functional_test.go:818: kube-controller-manager phase: Running
functional_test.go:828: kube-controller-manager status: Ready
functional_test.go:818: kube-scheduler phase: Running
functional_test.go:828: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)

TestFunctional/serial/LogsCmd (2.52s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1229: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 logs
functional_test.go:1229: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 logs: (2.518618586s)
--- PASS: TestFunctional/serial/LogsCmd (2.52s)

TestFunctional/serial/LogsFileCmd (2.66s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd252559006/001/logs.txt
functional_test.go:1243: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd252559006/001/logs.txt: (2.654817899s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.66s)

TestFunctional/parallel/ConfigCmd (0.51s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config unset cpus
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config get cpus
functional_test.go:1192: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 config get cpus: exit status 14 (71.407979ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config set cpus 2
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config get cpus
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config unset cpus
functional_test.go:1192: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 config get cpus
functional_test.go:1192: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 config get cpus: exit status 14 (56.238715ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.51s)

TestFunctional/parallel/DashboardCmd (13.09s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:898: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-105100 --alsologtostderr -v=1]
functional_test.go:903: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-105100 --alsologtostderr -v=1] ...
helpers_test.go:506: unable to kill pid 17716: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (13.09s)

TestFunctional/parallel/DryRun (1.03s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:967: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:967: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-105100 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (613.586617ms)

-- stdout --
	* [functional-105100] minikube v1.28.0 on Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	* Using the hyperkit driver based on existing profile

-- /stdout --
** stderr ** 
	I1128 10:54:29.157116   17694 out.go:296] Setting OutFile to fd 1 ...
	I1128 10:54:29.157300   17694 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:54:29.157305   17694 out.go:309] Setting ErrFile to fd 2...
	I1128 10:54:29.157309   17694 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:54:29.157415   17694 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 10:54:29.157909   17694 out.go:303] Setting JSON to false
	I1128 10:54:29.176841   17694 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6844,"bootTime":1669654825,"procs":426,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 10:54:29.176954   17694 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 10:54:29.198958   17694 out.go:177] * [functional-105100] minikube v1.28.0 on Darwin 13.0.1
	I1128 10:54:29.220069   17694 notify.go:220] Checking for updates...
	I1128 10:54:29.241623   17694 out.go:177]   - MINIKUBE_LOCATION=15411
	I1128 10:54:29.299909   17694 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 10:54:29.357851   17694 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 10:54:29.416687   17694 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 10:54:29.492058   17694 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 10:54:29.515594   17694 config.go:180] Loaded profile config "functional-105100": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 10:54:29.516353   17694 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:54:29.516450   17694 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 10:54:29.524219   17694 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55794
	I1128 10:54:29.524618   17694 main.go:134] libmachine: () Calling .GetVersion
	I1128 10:54:29.525045   17694 main.go:134] libmachine: Using API Version  1
	I1128 10:54:29.525058   17694 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 10:54:29.525282   17694 main.go:134] libmachine: () Calling .GetMachineName
	I1128 10:54:29.525375   17694 main.go:134] libmachine: (functional-105100) Calling .DriverName
	I1128 10:54:29.525484   17694 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 10:54:29.525754   17694 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:54:29.525782   17694 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 10:54:29.532514   17694 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55796
	I1128 10:54:29.532871   17694 main.go:134] libmachine: () Calling .GetVersion
	I1128 10:54:29.533192   17694 main.go:134] libmachine: Using API Version  1
	I1128 10:54:29.533205   17694 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 10:54:29.533385   17694 main.go:134] libmachine: () Calling .GetMachineName
	I1128 10:54:29.533482   17694 main.go:134] libmachine: (functional-105100) Calling .DriverName
	I1128 10:54:29.560869   17694 out.go:177] * Using the hyperkit driver based on existing profile
	I1128 10:54:29.602802   17694 start.go:293] selected driver: hyperkit
	I1128 10:54:29.602875   17694 start.go:837] validating driver "hyperkit" against &{Name:functional-105100 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.25.3 ClusterName:functional-105100 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.47 Port:8441 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-ser
ver:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 10:54:29.603102   17694 start.go:848] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1128 10:54:29.628791   17694 out.go:177] 
	W1128 10:54:29.650134   17694 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1128 10:54:29.671795   17694 out.go:177] 

** /stderr **
functional_test.go:984: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.03s)

TestFunctional/parallel/InternationalLanguage (0.46s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1013: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-105100 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1013: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-105100 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (461.416577ms)

-- stdout --
	* [functional-105100] minikube v1.28.0 sur Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	* Utilisation du pilote hyperkit basé sur le profil existant

-- /stdout --
** stderr ** 
	I1128 10:54:28.690811   17687 out.go:296] Setting OutFile to fd 1 ...
	I1128 10:54:28.691022   17687 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:54:28.691027   17687 out.go:309] Setting ErrFile to fd 2...
	I1128 10:54:28.691031   17687 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 10:54:28.691156   17687 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 10:54:28.691656   17687 out.go:303] Setting JSON to false
	I1128 10:54:28.711352   17687 start.go:124] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6843,"bootTime":1669654825,"procs":428,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.0.1","kernelVersion":"22.1.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W1128 10:54:28.711481   17687 start.go:132] gopshost.Virtualization returned error: not implemented yet
	I1128 10:54:28.731805   17687 out.go:177] * [functional-105100] minikube v1.28.0 sur Darwin 13.0.1
	I1128 10:54:28.773999   17687 notify.go:220] Checking for updates...
	I1128 10:54:28.795716   17687 out.go:177]   - MINIKUBE_LOCATION=15411
	I1128 10:54:28.816523   17687 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	I1128 10:54:28.838035   17687 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I1128 10:54:28.859911   17687 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1128 10:54:28.880726   17687 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	I1128 10:54:28.902822   17687 config.go:180] Loaded profile config "functional-105100": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 10:54:28.903517   17687 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:54:28.903602   17687 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 10:54:28.911409   17687 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55787
	I1128 10:54:28.911816   17687 main.go:134] libmachine: () Calling .GetVersion
	I1128 10:54:28.912207   17687 main.go:134] libmachine: Using API Version  1
	I1128 10:54:28.912217   17687 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 10:54:28.912404   17687 main.go:134] libmachine: () Calling .GetMachineName
	I1128 10:54:28.912497   17687 main.go:134] libmachine: (functional-105100) Calling .DriverName
	I1128 10:54:28.912609   17687 driver.go:365] Setting default libvirt URI to qemu:///system
	I1128 10:54:28.912883   17687 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 10:54:28.912916   17687 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 10:54:28.919596   17687 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55789
	I1128 10:54:28.919960   17687 main.go:134] libmachine: () Calling .GetVersion
	I1128 10:54:28.920327   17687 main.go:134] libmachine: Using API Version  1
	I1128 10:54:28.920345   17687 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 10:54:28.920571   17687 main.go:134] libmachine: () Calling .GetMachineName
	I1128 10:54:28.920676   17687 main.go:134] libmachine: (functional-105100) Calling .DriverName
	I1128 10:54:28.947912   17687 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I1128 10:54:28.989966   17687 start.go:293] selected driver: hyperkit
	I1128 10:54:28.990000   17687 start.go:837] validating driver "hyperkit" against &{Name:functional-105100 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/15235/minikube-v1.28.0-1668700269-15235-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.36-1668787669-15272@sha256:06094fc04b5dc02fbf1e2de7723c2a6db5d24c21fd2ddda91f6daaf29038cd9c Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.25.3 ClusterName:functional-105100 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.47 Port:8441 KubernetesVersion:v1.25.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-ser
ver:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I1128 10:54:28.990273   17687 start.go:848] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1128 10:54:29.016149   17687 out.go:177] 
	W1128 10:54:29.037830   17687 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1128 10:54:29.058879   17687 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.46s)

TestFunctional/parallel/StatusCmd (0.52s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:847: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 status
functional_test.go:853: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:865: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.52s)

TestFunctional/parallel/ServiceCmd (9.23s)

=== RUN   TestFunctional/parallel/ServiceCmd
=== PAUSE TestFunctional/parallel/ServiceCmd
=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1433: (dbg) Run:  kubectl --context functional-105100 create deployment hello-node --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1439: (dbg) Run:  kubectl --context functional-105100 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1444: (dbg) TestFunctional/parallel/ServiceCmd: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:342: "hello-node-5fcdfb5cc4-b92kt" [4de2901f-e7e6-4ee4-a5cd-7e3b32af997f] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:342: "hello-node-5fcdfb5cc4-b92kt" [4de2901f-e7e6-4ee4-a5cd-7e3b32af997f] Running

=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1444: (dbg) TestFunctional/parallel/ServiceCmd: app=hello-node healthy within 8.008606613s
functional_test.go:1449: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 service list
functional_test.go:1463: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 service --namespace=default --https --url hello-node
functional_test.go:1476: found endpoint: https://192.168.64.47:32520
functional_test.go:1491: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 service hello-node --url --format={{.IP}}
functional_test.go:1505: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 service hello-node --url
functional_test.go:1511: found endpoint for hello-node: http://192.168.64.47:32520
--- PASS: TestFunctional/parallel/ServiceCmd (9.23s)

TestFunctional/parallel/ServiceCmdConnect (13.35s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1559: (dbg) Run:  kubectl --context functional-105100 create deployment hello-node-connect --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1565: (dbg) Run:  kubectl --context functional-105100 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1570: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:342: "hello-node-connect-6458c8fb6f-mr5x2" [21cb6ade-8af8-4d75-aa90-6d7a3c4d64fa] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])

=== CONT  TestFunctional/parallel/ServiceCmdConnect
helpers_test.go:342: "hello-node-connect-6458c8fb6f-mr5x2" [21cb6ade-8af8-4d75-aa90-6d7a3c4d64fa] Running

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1570: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 13.007487906s
functional_test.go:1579: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 service hello-node-connect --url
functional_test.go:1585: found endpoint for hello-node-connect: http://192.168.64.47:31343
functional_test.go:1605: http://192.168.64.47:31343: success! body:

Hostname: hello-node-connect-6458c8fb6f-mr5x2

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=172.17.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.64.47:8080/

Request Headers:
	accept-encoding=gzip
	host=192.168.64.47:31343
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (13.35s)

TestFunctional/parallel/AddonsCmd (0.28s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1620: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 addons list
functional_test.go:1632: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.28s)

TestFunctional/parallel/PersistentVolumeClaim (28.64s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:342: "storage-provisioner" [e86e5a06-33b7-462b-8a63-ee3bce5f75fc] Running

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.006174484s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-105100 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-105100 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-105100 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-105100 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [1ba913f4-9187-449e-94bd-6287c4aa3719] Pending
helpers_test.go:342: "sp-pod" [1ba913f4-9187-449e-94bd-6287c4aa3719] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
E1128 10:54:01.311657   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [1ba913f4-9187-449e-94bd-6287c4aa3719] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 14.006089559s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-105100 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-105100 delete -f testdata/storage-provisioner/pod.yaml

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-105100 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [57b1421a-fa3b-49e0-9a97-bc74382620b6] Pending
helpers_test.go:342: "sp-pod" [57b1421a-fa3b-49e0-9a97-bc74382620b6] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:342: "sp-pod" [57b1421a-fa3b-49e0-9a97-bc74382620b6] Running

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.006873632s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-105100 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (28.64s)

TestFunctional/parallel/SSHCmd (0.34s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1655: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "echo hello"

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1672: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.34s)

TestFunctional/parallel/CpCmd (0.63s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cp testdata/cp-test.txt /home/docker/cp-test.txt

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh -n functional-105100 "sudo cat /home/docker/cp-test.txt"

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 cp functional-105100:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelCpCmd285773991/001/cp-test.txt

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh -n functional-105100 "sudo cat /home/docker/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.63s)

TestFunctional/parallel/MySQL (22.29s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1720: (dbg) Run:  kubectl --context functional-105100 replace --force -f testdata/mysql.yaml
functional_test.go:1726: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:342: "mysql-596b7fcdbf-n6qcj" [4ffb745c-1498-46ed-820a-a2ca88bd99eb] Pending
helpers_test.go:342: "mysql-596b7fcdbf-n6qcj" [4ffb745c-1498-46ed-820a-a2ca88bd99eb] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])

=== CONT  TestFunctional/parallel/MySQL
helpers_test.go:342: "mysql-596b7fcdbf-n6qcj" [4ffb745c-1498-46ed-820a-a2ca88bd99eb] Running

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1726: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 19.009643266s
functional_test.go:1734: (dbg) Run:  kubectl --context functional-105100 exec mysql-596b7fcdbf-n6qcj -- mysql -ppassword -e "show databases;"
functional_test.go:1734: (dbg) Non-zero exit: kubectl --context functional-105100 exec mysql-596b7fcdbf-n6qcj -- mysql -ppassword -e "show databases;": exit status 1 (113.37781ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1734: (dbg) Run:  kubectl --context functional-105100 exec mysql-596b7fcdbf-n6qcj -- mysql -ppassword -e "show databases;"
functional_test.go:1734: (dbg) Non-zero exit: kubectl --context functional-105100 exec mysql-596b7fcdbf-n6qcj -- mysql -ppassword -e "show databases;": exit status 1 (115.070147ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1734: (dbg) Run:  kubectl --context functional-105100 exec mysql-596b7fcdbf-n6qcj -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (22.29s)

TestFunctional/parallel/FileSync (0.16s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1856: Checking for existence of /etc/test/nested/copy/15823/hosts within VM
functional_test.go:1858: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /etc/test/nested/copy/15823/hosts"
functional_test.go:1863: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.16s)

TestFunctional/parallel/CertSync (1.05s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1899: Checking for existence of /etc/ssl/certs/15823.pem within VM
functional_test.go:1900: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /etc/ssl/certs/15823.pem"
functional_test.go:1899: Checking for existence of /usr/share/ca-certificates/15823.pem within VM
functional_test.go:1900: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /usr/share/ca-certificates/15823.pem"

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1899: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1900: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /etc/ssl/certs/51391683.0"

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1926: Checking for existence of /etc/ssl/certs/158232.pem within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /etc/ssl/certs/158232.pem"
functional_test.go:1926: Checking for existence of /usr/share/ca-certificates/158232.pem within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /usr/share/ca-certificates/158232.pem"
functional_test.go:1926: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.05s)

TestFunctional/parallel/NodeLabels (0.06s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:215: (dbg) Run:  kubectl --context functional-105100 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.06s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.2s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1954: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo systemctl is-active crio"
functional_test.go:1954: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh "sudo systemctl is-active crio": exit status 1 (200.03992ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.20s)

TestFunctional/parallel/License (0.46s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2215: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.46s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:127: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-105100 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.15s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:147: (dbg) Run:  kubectl --context functional-105100 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:342: "nginx-svc" [5f21d27c-191e-4dc1-9659-e16d4ed7750b] Pending

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
helpers_test.go:342: "nginx-svc" [5f21d27c-191e-4dc1-9659-e16d4ed7750b] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx-svc" [5f21d27c-191e-4dc1-9659-e16d4ed7750b] Running

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 11.013692771s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.15s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:169: (dbg) Run:  kubectl --context functional-105100 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:234: tunnel at http://10.103.84.213 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:254: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:262: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:286: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:294: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:359: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:369: (dbg) stopping [out/minikube-darwin-amd64 -p functional-105100 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.32s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.32s)

TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1311: Took "210.340887ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1325: Took "81.252821ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.29s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1362: Took "207.209192ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1375: Took "80.813901ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.29s)

TestFunctional/parallel/MountCmd/any-port (12.29s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:66: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port1813203532/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:100: wrote "test-1669661663597418000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port1813203532/001/created-by-test
functional_test_mount_test.go:100: wrote "test-1669661663597418000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port1813203532/001/created-by-test-removed-by-pod
functional_test_mount_test.go:100: wrote "test-1669661663597418000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port1813203532/001/test-1669661663597418000
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:108: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (160.744686ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh -- ls -la /mount-9p
functional_test_mount_test.go:126: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Nov 28 18:54 created-by-test
-rw-r--r-- 1 docker docker 24 Nov 28 18:54 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Nov 28 18:54 test-1669661663597418000
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh cat /mount-9p/test-1669661663597418000
functional_test_mount_test.go:141: (dbg) Run:  kubectl --context functional-105100 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:342: "busybox-mount" [c93481f7-d540-4634-9aec-e22d75d7b317] Pending
helpers_test.go:342: "busybox-mount" [c93481f7-d540-4634-9aec-e22d75d7b317] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])

=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [c93481f7-d540-4634-9aec-e22d75d7b317] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:342: "busybox-mount" [c93481f7-d540-4634-9aec-e22d75d7b317] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 10.007361497s
functional_test_mount_test.go:162: (dbg) Run:  kubectl --context functional-105100 logs busybox-mount
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:87: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port1813203532/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (12.29s)
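The file name written by the mount test appears to be a Unix timestamp in nanoseconds; a quick stdlib-only check (variable names here are illustrative) confirms that `test-1669661663597418000` matches the `Nov 28 18:54` mtimes in the guest listing above:

```python
from datetime import datetime, timezone

# Suffix of the mount-test file name above: Unix time in nanoseconds.
ns = 1669661663597418000

# Convert nanoseconds to seconds and render in UTC, like `ls` in the guest.
created = datetime.fromtimestamp(ns / 1e9, tz=timezone.utc)
print(created.strftime("%b %d %H:%M"))  # Nov 28 18:54
```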

TestFunctional/parallel/MountCmd/specific-port (1.61s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:206: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port4126225136/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (174.717184ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh -- ls -la /mount-9p
functional_test_mount_test.go:254: guest mount directory contents
total 0
functional_test_mount_test.go:256: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port4126225136/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:257: reading mount text
functional_test_mount_test.go:271: done reading mount text
functional_test_mount_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh "sudo umount -f /mount-9p": exit status 1 (131.544367ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:225: "out/minikube-darwin-amd64 -p functional-105100 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:227: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-105100 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port4126225136/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.61s)

TestFunctional/parallel/Version/short (0.14s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2183: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 version --short
--- PASS: TestFunctional/parallel/Version/short (0.14s)

TestFunctional/parallel/Version/components (0.45s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2197: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.45s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls --format short
functional_test.go:262: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-105100 image ls --format short:
registry.k8s.io/pause:3.8
registry.k8s.io/kube-scheduler:v1.25.3
registry.k8s.io/kube-proxy:v1.25.3
registry.k8s.io/kube-controller-manager:v1.25.3
registry.k8s.io/kube-apiserver:v1.25.3
registry.k8s.io/etcd:3.5.4-0
registry.k8s.io/coredns/coredns:v1.9.3
k8s.gcr.io/pause:latest
k8s.gcr.io/pause:3.6
k8s.gcr.io/pause:3.3
k8s.gcr.io/pause:3.1
k8s.gcr.io/echoserver:1.8
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-105100
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-105100
docker.io/kubernetesui/metrics-scraper:<none>
docker.io/kubernetesui/dashboard:<none>
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls --format table
functional_test.go:262: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-105100 image ls --format table:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| k8s.gcr.io/echoserver                       | 1.8               | 82e4c8a736a4f | 95.4MB |
| k8s.gcr.io/pause                            | latest            | 350b164e7ae1d | 240kB  |
| registry.k8s.io/etcd                        | 3.5.4-0           | a8a176a5d5d69 | 300MB  |
| k8s.gcr.io/pause                            | 3.3               | 0184c1613d929 | 683kB  |
| registry.k8s.io/pause                       | 3.8               | 4873874c08efc | 711kB  |
| registry.k8s.io/coredns/coredns             | v1.9.3            | 5185b96f0becf | 48.8MB |
| docker.io/library/nginx                     | latest            | 88736fe827391 | 142MB  |
| registry.k8s.io/kube-controller-manager     | v1.25.3           | 6039992312758 | 117MB  |
| registry.k8s.io/kube-scheduler              | v1.25.3           | 6d23ec0e8b87e | 50.6MB |
| registry.k8s.io/kube-proxy                  | v1.25.3           | beaaf00edd38a | 61.7MB |
| k8s.gcr.io/pause                            | 3.6               | 6270bb605e12e | 683kB  |
| gcr.io/google-containers/addon-resizer      | functional-105100 | ffd4cfbbe753e | 32.9MB |
| docker.io/library/mysql                     | 5.7               | eef0fab001e8d | 495MB  |
| registry.k8s.io/kube-apiserver              | v1.25.3           | 0346dbd74bcb9 | 128MB  |
| docker.io/kubernetesui/dashboard            | <none>            | 07655ddf2eebe | 246MB  |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| k8s.gcr.io/pause                            | 3.1               | da86e6ba6ca19 | 742kB  |
| docker.io/library/minikube-local-cache-test | functional-105100 | a723684ac14b3 | 30B    |
| docker.io/library/nginx                     | alpine            | 19dd4d73108a1 | 23.5MB |
|---------------------------------------------|-------------------|---------------|--------|
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls --format json
functional_test.go:262: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-105100 image ls --format json:
[{"id":"a723684ac14b3de0a120f72cc33dfc54207f653426523059944237a1d4002e81","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-105100"],"size":"30"},{"id":"07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.4-0"],"size":"300000000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.3"],"size":"683000"},{"id":"60399923127581086e9029f30a0c9e3c88708efa8fc05d22d5e33887e7c0310a","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.25.3"],"size":"117000000"},{"id":"82e4c8a736a4fcf2
2b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["k8s.gcr.io/echoserver:1.8"],"size":"95400000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["k8s.gcr.io/pause:latest"],"size":"240000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-105100"],"size":"32900000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.1"],"size":"742000"},{"id":"88736fe827391462a4db99252117f136b2b25d1d31719006326a437bb40cb12d","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"142000000"},{"id":"19dd4d73108a1feefc29d299f3727467ac02486c83474fc3979e4a7637291fe6","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"23500000"},{"id":"beaaf00edd38a6cb405376588e708084376a6786e722231dc8a1482730e0c041","repoDigests":[],"repoTags":["registry.k8s.
io/kube-proxy:v1.25.3"],"size":"61700000"},{"id":"4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.8"],"size":"711000"},{"id":"5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.9.3"],"size":"48800000"},{"id":"6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.6"],"size":"683000"},{"id":"eef0fab001e8dea739d538688b09e162bf54dd6c2bc04066bff99b5335cd6223","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"495000000"},{"id":"0346dbd74bcb9485bb4da1b33027094d79488470d8d1b9baa4d927db564e4fe0","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.25.3"],"size":"128000000"},{"id":"6d23ec0e8b87eaaa698c3425c2c4d25f7329c587e9b39d967ab3f60048983912","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.25.3"],"size":"50600000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20
a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"}]
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)
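The `--format json` listing above is the machine-readable variant of the table output; a short stdlib-only sketch (run against a trimmed two-entry sample of the listing, not the full dump) shows how it can be post-processed:

```python
import json

# Two entries copied from the `image ls --format json` output above;
# "size" is a decimal string of bytes.
sample = """[
  {"id": "a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66",
   "repoDigests": [], "repoTags": ["registry.k8s.io/etcd:3.5.4-0"], "size": "300000000"},
  {"id": "5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a",
   "repoDigests": [], "repoTags": ["registry.k8s.io/coredns/coredns:v1.9.3"], "size": "48800000"}
]"""

images = json.loads(sample)
tags = sorted(tag for img in images for tag in img["repoTags"])
total_bytes = sum(int(img["size"]) for img in images)
print(tags)
print(total_bytes)  # 348800000
```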

TestFunctional/parallel/ImageCommands/ImageListYaml (0.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls --format yaml
functional_test.go:262: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-105100 image ls --format yaml:
- id: 0346dbd74bcb9485bb4da1b33027094d79488470d8d1b9baa4d927db564e4fe0
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.25.3
size: "128000000"
- id: 115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests: []
repoTags:
- docker.io/kubernetesui/metrics-scraper:<none>
size: "43800000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-105100
size: "32900000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- k8s.gcr.io/echoserver:1.8
size: "95400000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- k8s.gcr.io/pause:latest
size: "240000"
- id: a723684ac14b3de0a120f72cc33dfc54207f653426523059944237a1d4002e81
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-105100
size: "30"
- id: 19dd4d73108a1feefc29d299f3727467ac02486c83474fc3979e4a7637291fe6
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "23500000"
- id: eef0fab001e8dea739d538688b09e162bf54dd6c2bc04066bff99b5335cd6223
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "495000000"
- id: 60399923127581086e9029f30a0c9e3c88708efa8fc05d22d5e33887e7c0310a
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.25.3
size: "117000000"
- id: 6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.6
size: "683000"
- id: 88736fe827391462a4db99252117f136b2b25d1d31719006326a437bb40cb12d
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "142000000"
- id: 6d23ec0e8b87eaaa698c3425c2c4d25f7329c587e9b39d967ab3f60048983912
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.25.3
size: "50600000"
- id: 07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558
repoDigests: []
repoTags:
- docker.io/kubernetesui/dashboard:<none>
size: "246000000"
- id: a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.4-0
size: "300000000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.3
size: "683000"
- id: beaaf00edd38a6cb405376588e708084376a6786e722231dc8a1482730e0c041
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.25.3
size: "61700000"
- id: 4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.8
size: "711000"
- id: 5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.9.3
size: "48800000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.1
size: "742000"

--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.23s)

TestFunctional/parallel/ImageCommands/ImageBuild (6.7s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 ssh pgrep buildkitd
functional_test.go:304: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-105100 ssh pgrep buildkitd: exit status 1 (133.307399ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image build -t localhost/my-image:functional-105100 testdata/build

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:311: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 image build -t localhost/my-image:functional-105100 testdata/build: (6.417898544s)
functional_test.go:316: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-105100 image build -t localhost/my-image:functional-105100 testdata/build:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 30f5088a95ce
Removing intermediate container 30f5088a95ce
---> dc22f391d74c
Step 3/3 : ADD content.txt /
---> e59f01a904e2
Successfully built e59f01a904e2
Successfully tagged localhost/my-image:functional-105100
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (6.70s)
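The `Step 1/3`..`Step 3/3` lines above imply a `testdata/build` Dockerfile along these lines (a reconstruction from the build log, not the verbatim test fixture):

```dockerfile
# Reconstructed from the logged build steps; the real fixture may differ.
FROM gcr.io/k8s-minikube/busybox
RUN true
ADD content.txt /
```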

TestFunctional/parallel/ImageCommands/Setup (5.53s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:338: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
E1128 10:54:42.271466   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
2022/11/28 10:54:42 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/

=== CONT  TestFunctional/parallel/ImageCommands/Setup
functional_test.go:338: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (5.466030005s)
functional_test.go:343: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-105100
--- PASS: TestFunctional/parallel/ImageCommands/Setup (5.53s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:351: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100

=== CONT  TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:351: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100: (3.065539444s)
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.23s)

TestFunctional/parallel/DockerEnv/bash (0.74s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:492: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-105100 docker-env) && out/minikube-darwin-amd64 status -p functional-105100"
functional_test.go:515: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-105100 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.74s)
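The DockerEnv test works by shell-`eval`ing the `export` lines that `docker-env` prints; the sketch below mimics that parsing in Python (the sample lines are illustrative values, not captured from this run):

```python
import re

# Illustrative docker-env style output; real values come from
# `minikube -p functional-105100 docker-env` and differ per host.
sample = '''export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.64.2:2376"
export DOCKER_CERT_PATH="/Users/jenkins/.minikube/certs"
export MINIKUBE_ACTIVE_DOCKERD="functional-105100"'''

# Same effect as the shell's `eval $(... docker-env)`: each export line
# becomes an environment variable visible to the docker CLI.
env = dict(re.findall(r'^export (\w+)="([^"]*)"$', sample, flags=re.M))
print(env["MINIKUBE_ACTIVE_DOCKERD"])  # functional-105100
```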

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2046: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.2s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2046: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.20s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2046: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.21s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.07s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:361: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:361: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100: (1.877446466s)
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.07s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (8.73s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:231: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:231: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (5.506649397s)
functional_test.go:236: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:241: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:241: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 image load --daemon gcr.io/google-containers/addon-resizer:functional-105100: (2.978394664s)
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (8.73s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.92s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:376: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image save gcr.io/google-containers/addon-resizer:functional-105100 /Users/jenkins/workspace/addon-resizer-save.tar
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.92s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.36s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:388: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image rm gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.36s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.11s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:405: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image load /Users/jenkins/workspace/addon-resizer-save.tar

=== CONT  TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:444: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.11s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:415: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:420: (dbg) Run:  out/minikube-darwin-amd64 -p functional-105100 image save --daemon gcr.io/google-containers/addon-resizer:functional-105100
functional_test.go:420: (dbg) Done: out/minikube-darwin-amd64 -p functional-105100 image save --daemon gcr.io/google-containers/addon-resizer:functional-105100: (1.887044056s)
functional_test.go:425: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-105100
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.00s)

TestFunctional/delete_addon-resizer_images (0.15s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:186: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:186: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-105100
--- PASS: TestFunctional/delete_addon-resizer_images (0.15s)

TestFunctional/delete_my-image_image (0.06s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:194: (dbg) Run:  docker rmi -f localhost/my-image:functional-105100
--- PASS: TestFunctional/delete_my-image_image (0.06s)

TestFunctional/delete_minikube_cached_images (0.06s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:202: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-105100
--- PASS: TestFunctional/delete_minikube_cached_images (0.06s)

TestIngressAddonLegacy/StartLegacyK8sCluster (75.01s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-105515 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
E1128 10:56:04.191306   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-105515 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m15.005067255s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (75.01s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (14.74s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons enable ingress --alsologtostderr -v=5
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons enable ingress --alsologtostderr -v=5: (14.741569517s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (14.74s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.49s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.49s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (38.1s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:165: (dbg) Run:  kubectl --context ingress-addon-legacy-105515 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:165: (dbg) Done: kubectl --context ingress-addon-legacy-105515 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (14.995982895s)
addons_test.go:185: (dbg) Run:  kubectl --context ingress-addon-legacy-105515 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:198: (dbg) Run:  kubectl --context ingress-addon-legacy-105515 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:203: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [8e77db8a-c231-4491-8829-1e5db7018518] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx" [8e77db8a-c231-4491-8829-1e5db7018518] Running
addons_test.go:203: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 9.007859122s
addons_test.go:215: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:239: (dbg) Run:  kubectl --context ingress-addon-legacy-105515 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 ip
addons_test.go:250: (dbg) Run:  nslookup hello-john.test 192.168.64.48
addons_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:259: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons disable ingress-dns --alsologtostderr -v=1: (5.974110649s)
addons_test.go:264: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons disable ingress --alsologtostderr -v=1
addons_test.go:264: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-105515 addons disable ingress --alsologtostderr -v=1: (7.235678508s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (38.10s)

TestJSONOutput/start/Command (59.22s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-105729 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E1128 10:58:20.338545   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-105729 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (59.216013196s)
--- PASS: TestJSONOutput/start/Command (59.22s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.48s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-105729 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.48s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.48s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-105729 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.48s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.16s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-105729 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-105729 --output=json --user=testUser: (8.160936863s)
--- PASS: TestJSONOutput/stop/Command (8.16s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.74s)

=== RUN   TestErrorJSONOutput
json_output_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-105838 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:149: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-105838 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (339.922787ms)

-- stdout --
	{"specversion":"1.0","id":"87cc49c1-5d54-44c6-b693-01a9212dd03a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-105838] minikube v1.28.0 on Darwin 13.0.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"a327776d-325b-4ab2-ab7e-fdfc6b8a3503","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=15411"}}
	{"specversion":"1.0","id":"96db021f-8881-494d-ab10-027d81c552be","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig"}}
	{"specversion":"1.0","id":"f8a68016-55c1-4114-a01e-3750fd140d09","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"0787a9b8-fc76-4887-a4ea-15872e79bb3e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"b31c824a-21cc-47b4-8fbe-77cb42ab0fdc","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube"}}
	{"specversion":"1.0","id":"24cb0589-0fa5-419e-a981-910e907b8cf8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-105838" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-105838
--- PASS: TestErrorJSONOutput (0.74s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMinikubeProfile (102.2s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-105839 --driver=hyperkit 
E1128 10:58:48.031998   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 10:58:53.872946   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:53.879075   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:53.889206   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:53.909334   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:53.949612   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:54.030312   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:54.190640   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:54.510856   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:55.152062   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:56.432729   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:58:58.993264   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:59:04.115013   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 10:59:14.355312   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-105839 --driver=hyperkit : (45.195091603s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-105839 --driver=hyperkit 
E1128 10:59:34.837340   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-105839 --driver=hyperkit : (49.160464071s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-105839
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-105839
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-105839" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-105839
E1128 11:00:15.798217   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-105839: (3.445701905s)
helpers_test.go:175: Cleaning up "first-105839" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-105839
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-105839: (3.454794762s)
--- PASS: TestMinikubeProfile (102.20s)

TestMountStart/serial/StartWithMountFirst (17.37s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-110021 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-110021 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (16.372548588s)
--- PASS: TestMountStart/serial/StartWithMountFirst (17.37s)

TestMountStart/serial/VerifyMountFirst (0.31s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-110021 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-110021 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.31s)

TestMountStart/serial/StartWithMountSecond (17.17s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-110021 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-110021 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (16.164136572s)
--- PASS: TestMountStart/serial/StartWithMountSecond (17.17s)

TestMountStart/serial/VerifyMountSecond (0.29s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.29s)

TestMountStart/serial/DeleteFirst (2.39s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-110021 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-110021 --alsologtostderr -v=5: (2.394535269s)
--- PASS: TestMountStart/serial/DeleteFirst (2.39s)

TestMountStart/serial/VerifyMountPostDelete (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

TestMountStart/serial/Stop (2.24s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-110021
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-110021: (2.242162884s)
--- PASS: TestMountStart/serial/Stop (2.24s)

TestMountStart/serial/RestartStopped (16.63s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-110021
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-110021: (15.625677897s)
--- PASS: TestMountStart/serial/RestartStopped (16.63s)

TestMountStart/serial/VerifyMountPostStop (0.31s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-110021 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.31s)

TestMultiNode/serial/FreshStart2Nodes (145.42s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-110121 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E1128 11:01:37.719378   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:01:46.156601   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.162549   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.172967   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.194428   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.234897   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.315807   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.476561   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:46.798581   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:47.440145   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:48.720249   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:51.281306   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:01:56.402259   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:02:06.642642   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:02:27.124731   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:03:08.085418   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:03:20.335837   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
multinode_test.go:83: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-110121 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (2m25.186568703s)
multinode_test.go:89: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (145.42s)

TestMultiNode/serial/DeployApp2Nodes (9.6s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- rollout status deployment/busybox
E1128 11:03:53.871166   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-110121 -- rollout status deployment/busybox: (7.86613023s)
multinode_test.go:490: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:502: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-6c5vx -- nslookup kubernetes.io
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-btrbk -- nslookup kubernetes.io
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-6c5vx -- nslookup kubernetes.default
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-btrbk -- nslookup kubernetes.default
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-6c5vx -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-btrbk -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (9.60s)

TestMultiNode/serial/PingHostFrom2Pods (0.86s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:538: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-6c5vx -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-6c5vx -- sh -c "ping -c 1 192.168.64.1"
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-btrbk -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-110121 -- exec busybox-65db55d5d6-btrbk -- sh -c "ping -c 1 192.168.64.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.86s)
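The host-IP extraction exercised above (`nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3`) can be sketched as a small standalone parser. The busybox-style nslookup output below is illustrative only, not captured from this run:

```python
# Replicates the shell pipeline from multinode_test.go:546 above:
#   nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3
# NSLOOKUP_OUT is a made-up sample in busybox nslookup format.
NSLOOKUP_OUT = """\
Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.168.64.1
"""

def host_ip(nslookup_output: str) -> str:
    # awk 'NR==5' -> take the fifth line
    line5 = nslookup_output.splitlines()[4]
    # cut -d' ' -f3 -> third space-separated field
    return line5.split(" ")[2]

print(host_ip(NSLOOKUP_OUT))  # -> 192.168.64.1
```

The pipeline is purely positional: it assumes the answer lands on line 5 with the IP as the third space-separated field, which is why the test depends on a specific nslookup output shape.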

TestMultiNode/serial/AddNode (44.17s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-110121 -v 3 --alsologtostderr
E1128 11:04:21.558204   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:04:30.005672   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
multinode_test.go:108: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-110121 -v 3 --alsologtostderr: (43.850948653s)
multinode_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (44.17s)

TestMultiNode/serial/ProfileList (0.21s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.21s)

TestMultiNode/serial/CopyFile (5.36s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --output json --alsologtostderr
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp testdata/cp-test.txt multinode-110121:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile717615996/001/cp-test_multinode-110121.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121:/home/docker/cp-test.txt multinode-110121-m02:/home/docker/cp-test_multinode-110121_multinode-110121-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test_multinode-110121_multinode-110121-m02.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121:/home/docker/cp-test.txt multinode-110121-m03:/home/docker/cp-test_multinode-110121_multinode-110121-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test_multinode-110121_multinode-110121-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp testdata/cp-test.txt multinode-110121-m02:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m02:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile717615996/001/cp-test_multinode-110121-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m02:/home/docker/cp-test.txt multinode-110121:/home/docker/cp-test_multinode-110121-m02_multinode-110121.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test_multinode-110121-m02_multinode-110121.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m02:/home/docker/cp-test.txt multinode-110121-m03:/home/docker/cp-test_multinode-110121-m02_multinode-110121-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test_multinode-110121-m02_multinode-110121-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp testdata/cp-test.txt multinode-110121-m03:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m03:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile717615996/001/cp-test_multinode-110121-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m03:/home/docker/cp-test.txt multinode-110121:/home/docker/cp-test_multinode-110121-m03_multinode-110121.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121 "sudo cat /home/docker/cp-test_multinode-110121-m03_multinode-110121.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 cp multinode-110121-m03:/home/docker/cp-test.txt multinode-110121-m02:/home/docker/cp-test_multinode-110121-m03_multinode-110121-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 ssh -n multinode-110121-m02 "sudo cat /home/docker/cp-test_multinode-110121-m03_multinode-110121-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.36s)

TestMultiNode/serial/StopNode (2.67s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 node stop m03
multinode_test.go:208: (dbg) Done: out/minikube-darwin-amd64 -p multinode-110121 node stop m03: (2.188071237s)
multinode_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status
multinode_test.go:214: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-110121 status: exit status 7 (245.982814ms)
-- stdout --
	multinode-110121
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-110121-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-110121-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:221: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
multinode_test.go:221: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr: exit status 7 (239.458036ms)
-- stdout --
	multinode-110121
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-110121-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-110121-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1128 11:04:49.153569   19531 out.go:296] Setting OutFile to fd 1 ...
	I1128 11:04:49.153745   19531 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:04:49.153750   19531 out.go:309] Setting ErrFile to fd 2...
	I1128 11:04:49.153754   19531 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:04:49.153893   19531 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 11:04:49.154104   19531 out.go:303] Setting JSON to false
	I1128 11:04:49.154129   19531 mustload.go:65] Loading cluster: multinode-110121
	I1128 11:04:49.154179   19531 notify.go:220] Checking for updates...
	I1128 11:04:49.154458   19531 config.go:180] Loaded profile config "multinode-110121": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:04:49.154474   19531 status.go:255] checking status of multinode-110121 ...
	I1128 11:04:49.154839   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.154898   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.161589   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57021
	I1128 11:04:49.161935   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.162331   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.162347   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.162545   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.162632   19531 main.go:134] libmachine: (multinode-110121) Calling .GetState
	I1128 11:04:49.162699   19531 main.go:134] libmachine: (multinode-110121) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:04:49.162771   19531 main.go:134] libmachine: (multinode-110121) DBG | hyperkit pid from json: 18964
	I1128 11:04:49.163868   19531 status.go:330] multinode-110121 host status = "Running" (err=<nil>)
	I1128 11:04:49.163883   19531 host.go:66] Checking if "multinode-110121" exists ...
	I1128 11:04:49.164132   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.164151   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.171160   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57023
	I1128 11:04:49.171528   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.171847   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.171859   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.172111   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.172216   19531 main.go:134] libmachine: (multinode-110121) Calling .GetIP
	I1128 11:04:49.172287   19531 host.go:66] Checking if "multinode-110121" exists ...
	I1128 11:04:49.172560   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.172587   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.179298   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57025
	I1128 11:04:49.179681   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.179994   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.180003   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.180198   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.180290   19531 main.go:134] libmachine: (multinode-110121) Calling .DriverName
	I1128 11:04:49.180439   19531 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1128 11:04:49.180461   19531 main.go:134] libmachine: (multinode-110121) Calling .GetSSHHostname
	I1128 11:04:49.180543   19531 main.go:134] libmachine: (multinode-110121) Calling .GetSSHPort
	I1128 11:04:49.180625   19531 main.go:134] libmachine: (multinode-110121) Calling .GetSSHKeyPath
	I1128 11:04:49.180699   19531 main.go:134] libmachine: (multinode-110121) Calling .GetSSHUsername
	I1128 11:04:49.180781   19531 sshutil.go:53] new ssh client: &{IP:192.168.64.54 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/multinode-110121/id_rsa Username:docker}
	I1128 11:04:49.213034   19531 ssh_runner.go:195] Run: systemctl --version
	I1128 11:04:49.216559   19531 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1128 11:04:49.226404   19531 kubeconfig.go:92] found "multinode-110121" server: "https://192.168.64.54:8443"
	I1128 11:04:49.226423   19531 api_server.go:165] Checking apiserver status ...
	I1128 11:04:49.226465   19531 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1128 11:04:49.235854   19531 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1810/cgroup
	I1128 11:04:49.242497   19531 api_server.go:181] apiserver freezer: "4:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36dcd5576a88b133fc9dcfa943e5309a.slice/docker-1ab06221b0a919d91119780c7e87280a1be03f96b0a218c6bee1ceb3b37c1c28.scope"
	I1128 11:04:49.242549   19531 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36dcd5576a88b133fc9dcfa943e5309a.slice/docker-1ab06221b0a919d91119780c7e87280a1be03f96b0a218c6bee1ceb3b37c1c28.scope/freezer.state
	I1128 11:04:49.249280   19531 api_server.go:203] freezer state: "THAWED"
	I1128 11:04:49.249298   19531 api_server.go:252] Checking apiserver healthz at https://192.168.64.54:8443/healthz ...
	I1128 11:04:49.253114   19531 api_server.go:278] https://192.168.64.54:8443/healthz returned 200:
	ok
	I1128 11:04:49.253127   19531 status.go:421] multinode-110121 apiserver status = Running (err=<nil>)
	I1128 11:04:49.253136   19531 status.go:257] multinode-110121 status: &{Name:multinode-110121 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1128 11:04:49.253149   19531 status.go:255] checking status of multinode-110121-m02 ...
	I1128 11:04:49.253421   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.253441   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.260425   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57029
	I1128 11:04:49.260803   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.261124   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.261134   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.261308   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.261416   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetState
	I1128 11:04:49.261500   19531 main.go:134] libmachine: (multinode-110121-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:04:49.261574   19531 main.go:134] libmachine: (multinode-110121-m02) DBG | hyperkit pid from json: 19089
	I1128 11:04:49.262644   19531 status.go:330] multinode-110121-m02 host status = "Running" (err=<nil>)
	I1128 11:04:49.262653   19531 host.go:66] Checking if "multinode-110121-m02" exists ...
	I1128 11:04:49.262910   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.262936   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.269672   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57031
	I1128 11:04:49.270039   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.270351   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.270362   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.270553   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.270657   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetIP
	I1128 11:04:49.270734   19531 host.go:66] Checking if "multinode-110121-m02" exists ...
	I1128 11:04:49.271008   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.271029   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.277735   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57033
	I1128 11:04:49.278128   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.278503   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.278517   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.278738   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.278845   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .DriverName
	I1128 11:04:49.278979   19531 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1128 11:04:49.278990   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetSSHHostname
	I1128 11:04:49.279069   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetSSHPort
	I1128 11:04:49.279206   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetSSHKeyPath
	I1128 11:04:49.279306   19531 main.go:134] libmachine: (multinode-110121-m02) Calling .GetSSHUsername
	I1128 11:04:49.279384   19531 sshutil.go:53] new ssh client: &{IP:192.168.64.55 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/15411-14646/.minikube/machines/multinode-110121-m02/id_rsa Username:docker}
	I1128 11:04:49.319037   19531 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1128 11:04:49.326999   19531 status.go:257] multinode-110121-m02 status: &{Name:multinode-110121-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1128 11:04:49.327018   19531 status.go:255] checking status of multinode-110121-m03 ...
	I1128 11:04:49.327313   19531 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:04:49.327335   19531 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:04:49.334292   19531 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57036
	I1128 11:04:49.334694   19531 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:04:49.335047   19531 main.go:134] libmachine: Using API Version  1
	I1128 11:04:49.335062   19531 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:04:49.335251   19531 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:04:49.335352   19531 main.go:134] libmachine: (multinode-110121-m03) Calling .GetState
	I1128 11:04:49.335430   19531 main.go:134] libmachine: (multinode-110121-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:04:49.335498   19531 main.go:134] libmachine: (multinode-110121-m03) DBG | hyperkit pid from json: 19269
	I1128 11:04:49.336541   19531 main.go:134] libmachine: (multinode-110121-m03) DBG | hyperkit pid 19269 missing from process table
	I1128 11:04:49.336575   19531 status.go:330] multinode-110121-m03 host status = "Stopped" (err=<nil>)
	I1128 11:04:49.336583   19531 status.go:343] host is not running, skipping remaining checks
	I1128 11:04:49.336594   19531 status.go:257] multinode-110121-m03 status: &{Name:multinode-110121-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.67s)
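The `minikube status` text blocks shown above have a simple record shape: an unindented node name, then `key: value` lines, with a blank line separating nodes. A minimal parsing sketch (`parse_status` is a hypothetical helper, not part of minikube; the sample mirrors the output above but is retyped, not captured):

```python
# Parse the plain-text `minikube status` format into per-node dicts.
STATUS_OUT = """\
multinode-110121
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

multinode-110121-m03
type: Worker
host: Stopped
kubelet: Stopped
"""

def parse_status(text: str) -> dict:
    nodes, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            current = None               # blank line ends the current record
        elif ":" in line:
            key, _, value = line.partition(":")
            nodes[current][key.strip()] = value.strip()
        else:
            current = line               # node-name header starts a new record
            nodes[current] = {}
    return nodes

status = parse_status(STATUS_OUT)
print(status["multinode-110121-m03"]["host"])  # -> Stopped
```

This record-per-node layout is why `minikube status` can report a non-zero exit (exit status 7 above) while still emitting a fully parseable listing for the nodes that are running.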

TestMultiNode/serial/StartAfterStop (30.67s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:252: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 node start m03 --alsologtostderr
multinode_test.go:252: (dbg) Done: out/minikube-darwin-amd64 -p multinode-110121 node start m03 --alsologtostderr: (30.321598564s)
multinode_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status
multinode_test.go:273: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (30.67s)

TestMultiNode/serial/RestartKeepsNodes (861.15s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-110121
multinode_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-110121
multinode_test.go:288: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-110121: (11.349985068s)
multinode_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-110121 --wait=true -v=8 --alsologtostderr
E1128 11:06:46.154446   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:07:13.844832   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:08:20.332293   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:08:53.867353   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:09:43.387196   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:11:46.150742   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:13:20.331440   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:13:53.865774   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:15:16.914558   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:16:46.147231   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:18:09.199878   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:18:20.328638   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:18:53.862060   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
multinode_test.go:293: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-110121 --wait=true -v=8 --alsologtostderr: (14m9.689479212s)
multinode_test.go:298: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-110121
--- PASS: TestMultiNode/serial/RestartKeepsNodes (861.15s)

TestMultiNode/serial/DeleteNode (4.96s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:392: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 node delete m03
multinode_test.go:392: (dbg) Done: out/minikube-darwin-amd64 -p multinode-110121 node delete m03: (4.641663216s)
multinode_test.go:398: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
multinode_test.go:422: (dbg) Run:  kubectl get nodes
multinode_test.go:430: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (4.96s)
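The go-template at multinode_test.go:430 above walks `kubectl get nodes -o json` output and prints the status of each node's "Ready" condition. The same selection expressed in Python, against an illustrative (not captured) nodes object:

```python
import json

# Made-up miniature of `kubectl get nodes -o json` output; only the fields
# the go-template touches are included.
NODES_JSON = json.loads("""
{"items": [
  {"metadata": {"name": "multinode-110121"},
   "status": {"conditions": [
     {"type": "MemoryPressure", "status": "False"},
     {"type": "Ready", "status": "True"}]}},
  {"metadata": {"name": "multinode-110121-m02"},
   "status": {"conditions": [
     {"type": "Ready", "status": "True"}]}}
]}
""")

def ready_statuses(nodes: dict) -> list:
    # Mirrors: {{range .items}}{{range .status.conditions}}
    #          {{if eq .type "Ready"}} {{.status}} {{end}}{{end}}{{end}}
    return [cond["status"]
            for item in nodes["items"]
            for cond in item["status"]["conditions"]
            if cond["type"] == "Ready"]

print(ready_statuses(NODES_JSON))  # -> ['True', 'True']
```

After the `node delete m03` step above, the test expects exactly one "True" per remaining node, which is what this per-item, per-condition walk surfaces.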

TestMultiNode/serial/StopMultiNode (4.47s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:312: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 stop
multinode_test.go:312: (dbg) Done: out/minikube-darwin-amd64 -p multinode-110121 stop: (4.323545671s)
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status
multinode_test.go:318: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-110121 status: exit status 7 (74.076479ms)
-- stdout --
	multinode-110121
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-110121-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
multinode_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr: exit status 7 (74.347943ms)

-- stdout --
	multinode-110121
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-110121-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1128 11:19:50.751295   20688 out.go:296] Setting OutFile to fd 1 ...
	I1128 11:19:50.751467   20688 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:19:50.751472   20688 out.go:309] Setting ErrFile to fd 2...
	I1128 11:19:50.751476   20688 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I1128 11:19:50.751587   20688 root.go:334] Updating PATH: /Users/jenkins/minikube-integration/15411-14646/.minikube/bin
	I1128 11:19:50.751795   20688 out.go:303] Setting JSON to false
	I1128 11:19:50.751818   20688 mustload.go:65] Loading cluster: multinode-110121
	I1128 11:19:50.751848   20688 notify.go:220] Checking for updates...
	I1128 11:19:50.752149   20688 config.go:180] Loaded profile config "multinode-110121": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.3
	I1128 11:19:50.752164   20688 status.go:255] checking status of multinode-110121 ...
	I1128 11:19:50.752514   20688 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:19:50.752572   20688 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:19:50.759191   20688 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57253
	I1128 11:19:50.760063   20688 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:19:50.760809   20688 main.go:134] libmachine: Using API Version  1
	I1128 11:19:50.760823   20688 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:19:50.761025   20688 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:19:50.761125   20688 main.go:134] libmachine: (multinode-110121) Calling .GetState
	I1128 11:19:50.761213   20688 main.go:134] libmachine: (multinode-110121) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:19:50.761285   20688 main.go:134] libmachine: (multinode-110121) DBG | hyperkit pid from json: 19648
	I1128 11:19:50.762110   20688 main.go:134] libmachine: (multinode-110121) DBG | hyperkit pid 19648 missing from process table
	I1128 11:19:50.762141   20688 status.go:330] multinode-110121 host status = "Stopped" (err=<nil>)
	I1128 11:19:50.762146   20688 status.go:343] host is not running, skipping remaining checks
	I1128 11:19:50.762157   20688 status.go:257] multinode-110121 status: &{Name:multinode-110121 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1128 11:19:50.762173   20688 status.go:255] checking status of multinode-110121-m02 ...
	I1128 11:19:50.762436   20688 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I1128 11:19:50.762457   20688 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I1128 11:19:50.769123   20688 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57255
	I1128 11:19:50.769467   20688 main.go:134] libmachine: () Calling .GetVersion
	I1128 11:19:50.769799   20688 main.go:134] libmachine: Using API Version  1
	I1128 11:19:50.769816   20688 main.go:134] libmachine: () Calling .SetConfigRaw
	I1128 11:19:50.770044   20688 main.go:134] libmachine: () Calling .GetMachineName
	I1128 11:19:50.770149   20688 main.go:134] libmachine: (multinode-110121-m02) Calling .GetState
	I1128 11:19:50.770232   20688 main.go:134] libmachine: (multinode-110121-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I1128 11:19:50.770289   20688 main.go:134] libmachine: (multinode-110121-m02) DBG | hyperkit pid from json: 20030
	I1128 11:19:50.771103   20688 main.go:134] libmachine: (multinode-110121-m02) DBG | hyperkit pid 20030 missing from process table
	I1128 11:19:50.771125   20688 status.go:330] multinode-110121-m02 host status = "Stopped" (err=<nil>)
	I1128 11:19:50.771130   20688 status.go:343] host is not running, skipping remaining checks
	I1128 11:19:50.771135   20688 status.go:257] multinode-110121-m02 status: &{Name:multinode-110121-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (4.47s)

TestMultiNode/serial/RestartMultiNode (555.37s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:352: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-110121 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E1128 11:21:46.325532   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:23:20.506640   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:23:54.042961   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:26:23.567742   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:26:46.332339   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:28:20.513365   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:28:54.050284   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
multinode_test.go:352: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-110121 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (9m15.034033032s)
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-110121 status --alsologtostderr
multinode_test.go:372: (dbg) Run:  kubectl get nodes
multinode_test.go:380: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (555.37s)

TestMultiNode/serial/ValidateNameConflict (48.19s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:441: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-110121
multinode_test.go:450: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-110121-m02 --driver=hyperkit 
multinode_test.go:450: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-110121-m02 --driver=hyperkit : exit status 14 (336.9206ms)

-- stdout --
	* [multinode-110121-m02] minikube v1.28.0 on Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-110121-m02' is duplicated with machine name 'multinode-110121-m02' in profile 'multinode-110121'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:458: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-110121-m03 --driver=hyperkit 
multinode_test.go:458: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-110121-m03 --driver=hyperkit : (42.248525896s)
multinode_test.go:465: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-110121
multinode_test.go:465: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-110121: exit status 80 (276.122337ms)

-- stdout --
	* Adding node m03 to cluster multinode-110121
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: Node multinode-110121-m03 already exists in multinode-110121-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:470: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-110121-m03
multinode_test.go:470: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-110121-m03: (5.270729934s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (48.19s)
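A sketch of the profile-name uniqueness check behind the `MK_USAGE` failure above: a requested profile name must not collide with an existing profile or with any machine name inside one. The data structure and helper name are illustrative, not minikube's own:

```python
# Profiles map a profile name to its machine names. A start is rejected
# when the new name duplicates either level (illustrative model only).
def name_conflicts(new_profile: str, profiles: dict[str, list[str]]) -> bool:
    for profile, machines in profiles.items():
        if new_profile == profile or new_profile in machines:
            return True
    return False

existing = {"multinode-110121": ["multinode-110121", "multinode-110121-m02"]}
print(name_conflicts("multinode-110121-m02", existing))  # True  -> exit status 14
print(name_conflicts("multinode-110121-m03", existing))  # False -> start proceeds
```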

TestPreload (147.54s)

=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-112959 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-112959 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m17.656054346s)
preload_test.go:57: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-112959 -- docker pull gcr.io/k8s-minikube/busybox
preload_test.go:57: (dbg) Done: out/minikube-darwin-amd64 ssh -p test-preload-112959 -- docker pull gcr.io/k8s-minikube/busybox: (5.77250554s)
preload_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-112959 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.24.6
E1128 11:31:46.338741   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:31:57.105210   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
preload_test.go:67: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-112959 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.24.6: (58.595500421s)
preload_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-112959 -- docker images
helpers_test.go:175: Cleaning up "test-preload-112959" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-112959
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-112959: (5.309444933s)
--- PASS: TestPreload (147.54s)

TestScheduledStopUnix (111.25s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-113226 --memory=2048 --driver=hyperkit 
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-113226 --memory=2048 --driver=hyperkit : (39.779941804s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-113226 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-113226 -n scheduled-stop-113226
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-113226 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-113226 --cancel-scheduled
E1128 11:33:20.520320   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-113226 -n scheduled-stop-113226
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-113226
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-113226 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
E1128 11:33:54.055678   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-113226
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-113226: exit status 7 (68.322694ms)

-- stdout --
	scheduled-stop-113226
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-113226 -n scheduled-stop-113226
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-113226 -n scheduled-stop-113226: exit status 7 (64.997184ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-113226" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-113226
--- PASS: TestScheduledStopUnix (111.25s)

TestSkaffold (79.1s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe19305626 version
skaffold_test.go:63: skaffold version: v2.0.2
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-113417 --memory=2600 --driver=hyperkit 
E1128 11:34:49.396693   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-113417 --memory=2600 --driver=hyperkit : (39.77666378s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe19305626 run --minikube-profile skaffold-113417 --kube-context skaffold-113417 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:105: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe19305626 run --minikube-profile skaffold-113417 --kube-context skaffold-113417 --status-check=true --port-forward=false --interactive=false: (22.24285376s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:342: "leeroy-app-5d5c74d9bc-l5pvt" [9babaf8c-9a3c-43a1-b66f-acd9410f9f8c] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 5.012978544s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:342: "leeroy-web-d46847b8b-nv7vl" [9471b486-1a52-4ab1-a469-6947adfca641] Running
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.007508499s
helpers_test.go:175: Cleaning up "skaffold-113417" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-113417
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-113417: (5.275029773s)
--- PASS: TestSkaffold (79.10s)

TestRunningBinaryUpgrade (171.51s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.3481159359.exe start -p running-upgrade-114136 --memory=2200 --vm-driver=hyperkit 
E1128 11:41:43.688412   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:41:46.350196   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.3481159359.exe start -p running-upgrade-114136 --memory=2200 --vm-driver=hyperkit : (1m50.015884387s)
version_upgrade_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-114136 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E1128 11:43:54.069061   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:137: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-114136 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (55.518575555s)
helpers_test.go:175: Cleaning up "running-upgrade-114136" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-114136
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-114136: (5.284192468s)
--- PASS: TestRunningBinaryUpgrade (171.51s)

TestKubernetesUpgrade (167.34s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:229: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
E1128 11:38:20.525747   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:38:54.062123   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
version_upgrade_test.go:229: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m11.998832527s)
version_upgrade_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-113819
version_upgrade_test.go:234: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-113819: (2.234101587s)
version_upgrade_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-113819 status --format={{.Host}}
version_upgrade_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-113819 status --format={{.Host}}: exit status 7 (65.532053ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:241: status error: exit status 7 (may be ok)
version_upgrade_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.25.3 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:250: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.25.3 --alsologtostderr -v=1 --driver=hyperkit : (39.952962098s)
version_upgrade_test.go:255: (dbg) Run:  kubectl --context kubernetes-upgrade-113819 version --output=json
version_upgrade_test.go:274: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:276: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:276: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (559.384646ms)

-- stdout --
	* [kubernetes-upgrade-113819] minikube v1.28.0 on Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.25.3 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-113819
	    minikube start -p kubernetes-upgrade-113819 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-1138192 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.25.3, by running:
	    
	    minikube start -p kubernetes-upgrade-113819 --kubernetes-version=v1.25.3
	    

** /stderr **
version_upgrade_test.go:280: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.25.3 --alsologtostderr -v=1 --driver=hyperkit 
E1128 11:40:21.745067   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:21.750333   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:21.760511   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:21.780656   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:21.820837   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:21.901670   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:22.061843   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:22.382102   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:23.023438   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:24.305084   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:26.865751   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:31.988072   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:40:42.228552   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 11:41:02.726361   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
version_upgrade_test.go:282: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-113819 --memory=2200 --kubernetes-version=v1.25.3 --alsologtostderr -v=1 --driver=hyperkit : (48.900636449s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-113819" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-113819

=== CONT  TestKubernetesUpgrade
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-113819: (3.583376918s)
--- PASS: TestKubernetesUpgrade (167.34s)
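A sketch of the downgrade guard that produces `K8S_DOWNGRADE_UNSUPPORTED` above: minikube refuses to move an existing cluster to an older Kubernetes version, while restarting at the same version is allowed. Function names here are illustrative, not minikube's own:

```python
# Compare semver-style "vMAJOR.MINOR.PATCH" strings as integer tuples.
def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.lstrip("v").split("."))

def is_downgrade(current: str, requested: str) -> bool:
    return parse_version(requested) < parse_version(current)

print(is_downgrade("v1.25.3", "v1.16.0"))  # True  -> rejected, exit status 106
print(is_downgrade("v1.25.3", "v1.25.3"))  # False -> restart allowed
```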

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.73s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.28.0 on darwin
- MINIKUBE_LOCATION=15411
- KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current111666153/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current111666153/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current111666153/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current111666153/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.73s)
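The `sudo chown root:wheel` / `sudo chmod u+s` commands above establish the ownership and setuid bit the hyperkit driver needs. A sketch of that check on raw uid/mode values; the helper name is illustrative and this is not minikube's actual implementation:

```python
import stat

# The driver binary must be owned by root (uid 0) and carry the setuid
# bit, which is what chown root:wheel + chmod u+s produce together.
def driver_permissions_ok(uid: int, mode: int) -> bool:
    return uid == 0 and bool(mode & stat.S_ISUID)

print(driver_permissions_ok(0, 0o4755))   # True: root-owned, setuid set
print(driver_permissions_ok(501, 0o755))  # False: would trigger the warning
```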

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.7s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.28.0 on darwin
- MINIKUBE_LOCATION=15411
- KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2801612532/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2801612532/001/.minikube/bin/docker-machine-driver-hyperkit
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2801612532/001/.minikube/bin/docker-machine-driver-hyperkit

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2801612532/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.70s)

TestStoppedBinaryUpgrade/Setup (0.7s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.70s)

TestStoppedBinaryUpgrade/Upgrade (184.4s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.4174903856.exe start -p stopped-upgrade-114107 --memory=2200 --vm-driver=hyperkit

=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.4174903856.exe start -p stopped-upgrade-114107 --memory=2200 --vm-driver=hyperkit : (1m54.407994526s)
version_upgrade_test.go:199: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.4174903856.exe -p stopped-upgrade-114107 stop
E1128 11:43:03.590791   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:43:05.612404   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
version_upgrade_test.go:199: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.4174903856.exe -p stopped-upgrade-114107 stop: (8.138183617s)
version_upgrade_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-114107 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E1128 11:43:20.532572   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory

=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:205: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-114107 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m1.856978381s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (184.40s)

TestStoppedBinaryUpgrade/MinikubeLogs (3.08s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:213: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-114107
version_upgrade_test.go:213: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-114107: (3.080157123s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (3.08s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.48s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (478.306768ms)

-- stdout --
	* [NoKubernetes-114420] minikube v1.28.0 on Darwin 13.0.1
	  - MINIKUBE_LOCATION=15411
	  - KUBECONFIG=/Users/jenkins/minikube-integration/15411-14646/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/15411-14646/.minikube

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.48s)

TestNoKubernetes/serial/StartWithK8s (41.95s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-114420 --driver=hyperkit

=== CONT  TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-114420 --driver=hyperkit : (41.801653226s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-114420 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (41.95s)

TestPause/serial/Start (61s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-114428 --memory=2048 --install-addons=false --wait=all --driver=hyperkit

=== CONT  TestPause/serial/Start
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-114428 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (1m0.996523754s)
--- PASS: TestPause/serial/Start (61.00s)

TestNoKubernetes/serial/StartWithStopK8s (16.59s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --driver=hyperkit : (14.122297881s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-114420 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-114420 status -o json: exit status 2 (139.255315ms)

-- stdout --
	{"Name":"NoKubernetes-114420","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-114420
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-114420: (2.325236008s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (16.59s)

TestNoKubernetes/serial/Start (14.86s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --driver=hyperkit 
E1128 11:45:21.753106   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory

=== CONT  TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-114420 --no-kubernetes --driver=hyperkit : (14.860505261s)
--- PASS: TestNoKubernetes/serial/Start (14.86s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-114420 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-114420 "sudo systemctl is-active --quiet service kubelet": exit status 1 (131.665255ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

TestNoKubernetes/serial/ProfileList (2.75s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:169: (dbg) Done: out/minikube-darwin-amd64 profile list: (2.404727875s)
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (2.75s)

TestNoKubernetes/serial/Stop (2.21s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-114420
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-114420: (2.20699224s)
--- PASS: TestNoKubernetes/serial/Stop (2.21s)

TestNoKubernetes/serial/StartNoArgs (14.25s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-114420 --driver=hyperkit 
E1128 11:45:49.457107   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-114420 --driver=hyperkit : (14.246306782s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (14.25s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-114420 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-114420 "sudo systemctl is-active --quiet service kubelet": exit status 1 (121.166528ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

TestNetworkPlugins/group/auto/Start (58.28s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit

=== CONT  TestNetworkPlugins/group/auto/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p auto-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit : (58.284778504s)
--- PASS: TestNetworkPlugins/group/auto/Start (58.28s)

TestNetworkPlugins/group/kindnet/Start (63.29s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit 
E1128 11:46:46.357205   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit : (1m3.287210172s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (63.29s)

TestNetworkPlugins/group/auto/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.16s)

TestNetworkPlugins/group/auto/NetCatPod (14.18s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context auto-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-kwb5z" [fe36f0f7-7778-4351-8126-30d48d160458] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-kwb5z" [fe36f0f7-7778-4351-8126-30d48d160458] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 14.003870732s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (14.18s)

TestNetworkPlugins/group/auto/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:169: (dbg) Run:  kubectl --context auto-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.12s)

TestNetworkPlugins/group/auto/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:188: (dbg) Run:  kubectl --context auto-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.10s)

TestNetworkPlugins/group/auto/HairPin (5.11s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:238: (dbg) Run:  kubectl --context auto-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context auto-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.105434744s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
--- PASS: TestNetworkPlugins/group/auto/HairPin (5.11s)

TestNetworkPlugins/group/cilium/Start (97.39s)

=== RUN   TestNetworkPlugins/group/cilium/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p cilium-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit

=== CONT  TestNetworkPlugins/group/cilium/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p cilium-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit : (1m37.389114987s)
--- PASS: TestNetworkPlugins/group/cilium/Start (97.39s)

TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:342: "kindnet-flpvx" [0718c8fc-ca00-47d2-9f43-15757b906263] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 5.009227168s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.15s)

TestNetworkPlugins/group/kindnet/NetCatPod (14.23s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kindnet-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-7ld5n" [31a4fd48-0231-4b9d-8fe9-c46376dbf869] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-7ld5n" [31a4fd48-0231-4b9d-8fe9-c46376dbf869] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 14.008739112s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (14.23s)

TestNetworkPlugins/group/kindnet/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kindnet-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.15s)

TestNetworkPlugins/group/kindnet/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kindnet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.11s)

TestNetworkPlugins/group/kindnet/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kindnet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.11s)

TestNetworkPlugins/group/calico/Start (311.05s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit 
E1128 11:48:20.539552   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:48:37.126827   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:48:54.074986   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p calico-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit : (5m11.053630192s)
--- PASS: TestNetworkPlugins/group/calico/Start (311.05s)

TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/cilium/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: waiting 10m0s for pods matching "k8s-app=cilium" in namespace "kube-system" ...
helpers_test.go:342: "cilium-2p2hj" [d2986cfd-ac22-49df-b8f8-a2a7f3548be1] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: k8s-app=cilium healthy within 5.013993291s
--- PASS: TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/cilium/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cilium-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

TestNetworkPlugins/group/cilium/NetCatPod (15.65s)

=== RUN   TestNetworkPlugins/group/cilium/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context cilium-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-h7mlx" [0cd8bb05-2adc-4943-8409-0f260f1eaf22] Pending
helpers_test.go:342: "netcat-5788d667bd-h7mlx" [0cd8bb05-2adc-4943-8409-0f260f1eaf22] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-h7mlx" [0cd8bb05-2adc-4943-8409-0f260f1eaf22] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: app=netcat healthy within 15.008562863s
--- PASS: TestNetworkPlugins/group/cilium/NetCatPod (15.65s)

TestNetworkPlugins/group/cilium/DNS (0.14s)

=== RUN   TestNetworkPlugins/group/cilium/DNS
net_test.go:169: (dbg) Run:  kubectl --context cilium-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/cilium/DNS (0.14s)

TestNetworkPlugins/group/cilium/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/cilium/Localhost
net_test.go:188: (dbg) Run:  kubectl --context cilium-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/cilium/Localhost (0.10s)

TestNetworkPlugins/group/cilium/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/cilium/HairPin
net_test.go:238: (dbg) Run:  kubectl --context cilium-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/cilium/HairPin (0.10s)

TestNetworkPlugins/group/custom-flannel/Start (103.1s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
E1128 11:50:21.759178   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (1m43.096952465s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (103.10s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (15.19s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context custom-flannel-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-mx6lf" [c4942886-4bdd-4e6c-86c4-15770f4cc8b9] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-mx6lf" [c4942886-4bdd-4e6c-86c4-15770f4cc8b9] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 15.006141589s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (15.19s)

TestNetworkPlugins/group/custom-flannel/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context custom-flannel-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.11s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context custom-flannel-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context custom-flannel-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.10s)

TestNetworkPlugins/group/false/Start (54.65s)

=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p false-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit 
E1128 11:51:29.418521   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:51:46.363108   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 11:51:54.887992   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:54.893302   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:54.905258   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:54.926983   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:54.967160   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:55.048099   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:55.209694   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:55.530670   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:56.170877   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:51:57.451481   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:52:00.013109   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:52:05.134182   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:52:15.374628   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p false-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit : (54.645509799s)
--- PASS: TestNetworkPlugins/group/false/Start (54.65s)

TestNetworkPlugins/group/false/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.14s)

TestNetworkPlugins/group/false/NetCatPod (14.19s)

=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context false-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-6qj5h" [cf68ed8a-6de2-4172-b8d4-364e1e952907] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E1128 11:52:25.786541   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:25.791633   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:25.801768   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:25.822912   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:25.863790   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:25.944619   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:26.105151   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:26.426386   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:27.066536   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:52:28.346777   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-6qj5h" [cf68ed8a-6de2-4172-b8d4-364e1e952907] Running
E1128 11:52:30.907246   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 14.00634004s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (14.19s)

TestNetworkPlugins/group/false/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:169: (dbg) Run:  kubectl --context false-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.12s)

TestNetworkPlugins/group/false/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:188: (dbg) Run:  kubectl --context false-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.10s)

TestNetworkPlugins/group/false/HairPin (5.11s)

=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:238: (dbg) Run:  kubectl --context false-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E1128 11:52:35.855330   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:52:36.027707   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context false-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.105040119s)

** stderr **
	command terminated with exit code 1
** /stderr **
--- PASS: TestNetworkPlugins/group/false/HairPin (5.11s)

TestNetworkPlugins/group/enable-default-cni/Start (54.48s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit 
E1128 11:52:46.268623   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
=== CONT  TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit : (54.483064992s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (54.48s)

TestNetworkPlugins/group/calico/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:342: "calico-node-lsfkn" [87366332-b707-442c-92c4-d04abe7b2037] Running / Ready:ContainersNotReady (containers with unready status: [calico-node]) / ContainersReady:ContainersNotReady (containers with unready status: [calico-node])
E1128 11:53:06.749674   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 5.012368563s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (5.01s)

TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

TestNetworkPlugins/group/calico/NetCatPod (15.31s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context calico-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-tfmt6" [e59d5c04-e9ef-43aa-b8dd-1e0e3e726f2f] Pending
helpers_test.go:342: "netcat-5788d667bd-tfmt6" [e59d5c04-e9ef-43aa-b8dd-1e0e3e726f2f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-tfmt6" [e59d5c04-e9ef-43aa-b8dd-1e0e3e726f2f] Running
E1128 11:53:16.817680   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:53:20.545199   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 15.009898714s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (15.31s)

TestNetworkPlugins/group/calico/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:169: (dbg) Run:  kubectl --context calico-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.15s)

TestNetworkPlugins/group/calico/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:188: (dbg) Run:  kubectl --context calico-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.10s)

TestNetworkPlugins/group/calico/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:238: (dbg) Run:  kubectl --context calico-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.11s)

TestNetworkPlugins/group/flannel/Start (58.38s)

=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit 
=== CONT  TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit : (58.377994984s)
--- PASS: TestNetworkPlugins/group/flannel/Start (58.38s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (15.19s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context enable-default-cni-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-bnlbq" [f8c7616a-b0d1-494f-a9df-fc030cf145a3] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E1128 11:53:47.710807   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-bnlbq" [f8c7616a-b0d1-494f-a9df-fc030cf145a3] Running
E1128 11:53:54.081386   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 15.007796465s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (15.19s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:169: (dbg) Run:  kubectl --context enable-default-cni-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:188: (dbg) Run:  kubectl --context enable-default-cni-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:238: (dbg) Run:  kubectl --context enable-default-cni-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.10s)

TestNetworkPlugins/group/bridge/Start (59.47s)

=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit 
E1128 11:53:59.553962   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:54:02.114256   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:54:07.235241   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:54:17.571058   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
=== CONT  TestNetworkPlugins/group/bridge/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit : (59.466049784s)
--- PASS: TestNetworkPlugins/group/bridge/Start (59.47s)

TestNetworkPlugins/group/flannel/ControllerPod (8.01s)

=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-system" ...
helpers_test.go:342: "kube-flannel-ds-amd64-slfj2" [0d5a3d4d-1bb9-4856-afc4-75c5dfd43b26] Pending: Initialized:ContainersNotInitialized (containers with incomplete status: [install-cni]) / Ready:ContainersNotReady (containers with unready status: [kube-flannel]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-flannel])
helpers_test.go:342: "kube-flannel-ds-amd64-slfj2" [0d5a3d4d-1bb9-4856-afc4-75c5dfd43b26] Pending / Ready:ContainersNotReady (containers with unready status: [kube-flannel]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-flannel])
helpers_test.go:342: "kube-flannel-ds-amd64-slfj2" [0d5a3d4d-1bb9-4856-afc4-75c5dfd43b26] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 8.00998804s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (8.01s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.16s)

TestNetworkPlugins/group/flannel/NetCatPod (15.22s)

=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context flannel-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-lrvvt" [41ab54d6-2f53-4918-ab8f-978c26e5afeb] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E1128 11:54:38.052809   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:54:38.833480   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-lrvvt" [41ab54d6-2f53-4918-ab8f-978c26e5afeb] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 15.006516675s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (15.22s)

TestNetworkPlugins/group/flannel/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context flannel-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.11s)

TestNetworkPlugins/group/flannel/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context flannel-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.10s)

TestNetworkPlugins/group/flannel/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context flannel-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.10s)

TestNetworkPlugins/group/kubenet/Start (56.75s)

=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit 
=== CONT  TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-113537 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit : (56.746538422s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (56.75s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

TestNetworkPlugins/group/bridge/NetCatPod (14.21s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context bridge-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-t5d2p" [ddaeeaec-7e77-433d-86f3-aded61228ac2] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-t5d2p" [ddaeeaec-7e77-433d-86f3-aded61228ac2] Running
E1128 11:55:09.726215   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 14.005873332s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (14.21s)

TestNetworkPlugins/group/bridge/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:169: (dbg) Run:  kubectl --context bridge-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.12s)

TestNetworkPlugins/group/bridge/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:188: (dbg) Run:  kubectl --context bridge-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.11s)

TestNetworkPlugins/group/bridge/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:238: (dbg) Run:  kubectl --context bridge-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.10s)

TestStartStop/group/old-k8s-version/serial/FirstStart (345.77s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-115518 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E1128 11:55:19.015862   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:55:21.858496   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
=== CONT  TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-115518 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (5m45.767828813s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (345.77s)

                                                
                                    
TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-113537 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/kubenet/NetCatPod (14.19s)

=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kubenet-113537 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-2jpz4" [f5045f86-c7f8-459b-8a14-ad8b56c1ef10] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-2jpz4" [f5045f86-c7f8-459b-8a14-ad8b56c1ef10] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 14.007925854s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (14.19s)

                                                
                                    
TestNetworkPlugins/group/kubenet/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.11s)

                                                
                                    
TestNetworkPlugins/group/kubenet/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kubenet-113537 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)

                                                
                                    
TestStartStop/group/no-preload/serial/FirstStart (77.6s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-115704 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 11:57:20.592685   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.598256   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.608359   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.628511   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.670274   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.750934   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:20.911081   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:21.231970   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:21.872598   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:22.677510   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 11:57:23.153034   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:25.714390   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:25.887093   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:57:28.823228   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:57:30.834816   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:41.075637   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:57:53.570142   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 11:58:01.557231   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:58:02.036036   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.042390   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.054545   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.075495   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.115699   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.196862   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.357781   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:02.724079   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:03.365651   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:04.646966   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:07.207289   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:12.328774   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:58:20.646261   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-115704 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.3: (1m17.603676395s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (77.60s)

                                                
                                    
TestStartStop/group/no-preload/serial/DeployApp (13.27s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-115704 create -f testdata/busybox.yaml
E1128 11:58:22.569334   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [d9ab1ee2-ef69-4254-a3d7-54ce755b61be] Pending
helpers_test.go:342: "busybox" [d9ab1ee2-ef69-4254-a3d7-54ce755b61be] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [d9ab1ee2-ef69-4254-a3d7-54ce755b61be] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 13.01645454s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-115704 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (13.27s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.68s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-115704 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-115704 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.68s)

                                                
                                    
TestStartStop/group/no-preload/serial/Stop (8.26s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-115704 --alsologtostderr -v=3
E1128 11:58:40.022970   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.029400   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.040385   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.062683   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.103198   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.183485   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.344645   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:40.665299   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:41.306372   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:42.520334   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 11:58:42.586702   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:43.050288   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-115704 --alsologtostderr -v=3: (8.258354522s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (8.26s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.3s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-115704 -n no-preload-115704
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-115704 -n no-preload-115704: exit status 7 (65.404271ms)
-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-115704 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.30s)

                                                
                                    
TestStartStop/group/no-preload/serial/SecondStart (315.91s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-115704 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 11:58:45.147501   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:50.268247   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:58:50.746602   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 11:58:54.181261   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 11:58:57.036564   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:59:00.509081   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:59:20.989981   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 11:59:24.011976   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 11:59:24.784146   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
E1128 11:59:26.526313   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.532071   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.542263   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.563351   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.604090   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.684455   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:26.846663   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:27.166806   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:27.807133   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:29.087975   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:31.648510   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:36.769907   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:43.706345   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 11:59:47.010716   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 11:59:58.625766   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.630945   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.642046   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.662905   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.704957   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.786937   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:58.947930   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:59.268226   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 11:59:59.909237   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:01.190352   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:01.951327   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:00:03.750721   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:04.444360   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 12:00:07.492389   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:00:08.871057   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:19.112598   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:21.865559   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 12:00:39.593905   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:00:45.934148   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 12:00:48.453607   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:00:50.735177   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:50.741194   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:50.751718   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:50.771920   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:50.812787   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:50.893679   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:51.055002   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:51.375570   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:52.016457   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:53.297698   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:00:55.858878   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:01:00.981318   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
=== CONT  TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-115704 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.3: (5m15.750979798s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-115704 -n no-preload-115704
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (315.91s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/DeployApp (12.29s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-115518 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [e06201ba-d124-43c5-b213-1acf2abd82ba] Pending
helpers_test.go:342: "busybox" [e06201ba-d124-43c5-b213-1acf2abd82ba] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E1128 12:01:06.874896   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 12:01:11.222310   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
helpers_test.go:342: "busybox" [e06201ba-d124-43c5-b213-1acf2abd82ba] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 12.013295476s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-115518 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (12.29s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.64s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-115518 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-115518 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.64s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/Stop (2.24s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-115518 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-115518 --alsologtostderr -v=3: (2.237128757s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (2.24s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.3s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-115518 -n old-k8s-version-115518
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-115518 -n old-k8s-version-115518: exit status 7 (64.523816ms)
-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-115518 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.30s)

TestStartStop/group/old-k8s-version/serial/SecondStart (453.79s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-115518 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E1128 12:01:20.554953   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:01:23.873480   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:01:31.703485   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:01:34.590537   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 12:01:46.469239   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 12:01:54.994075   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 12:02:10.376106   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:02:12.664909   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:02:20.600542   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 12:02:25.893035   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 12:02:42.479006   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:02:48.288049   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 12:03:02.040787   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 12:03:20.651320   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 12:03:29.779559   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 12:03:34.587063   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:03:40.030080   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:03:54.187522   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 12:03:57.042381   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-115518 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (7m33.624774047s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-115518 -n old-k8s-version-115518
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (453.79s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (19.01s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-xc9fp" [34a160de-2fa4-424d-9029-758767c0f29a] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E1128 12:04:07.717100   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-xc9fp" [34a160de-2fa4-424d-9029-758767c0f29a] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 19.012815882s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (19.01s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-xc9fp" [34a160de-2fa4-424d-9029-758767c0f29a] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.005877111s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-115704 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p no-preload-115704 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/no-preload/serial/Pause (1.89s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-115704 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-115704 -n no-preload-115704
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-115704 -n no-preload-115704: exit status 2 (158.44187ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-115704 -n no-preload-115704
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-115704 -n no-preload-115704: exit status 2 (153.068684ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-115704 --alsologtostderr -v=1
E1128 12:04:26.532406   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-115704 -n no-preload-115704
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-115704 -n no-preload-115704
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.89s)

TestStartStop/group/embed-certs/serial/FirstStart (54.27s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-120432 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:04:54.220093   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:04:58.632577   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
E1128 12:05:17.242954   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 12:05:21.871175   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
E1128 12:05:26.323728   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-120432 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.3: (54.268514429s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (54.27s)

TestStartStop/group/embed-certs/serial/DeployApp (13.26s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-120432 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [6f6eae3d-1b39-4506-a53b-6f40b6afc390] Pending
helpers_test.go:342: "busybox" [6f6eae3d-1b39-4506-a53b-6f40b6afc390] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [6f6eae3d-1b39-4506-a53b-6f40b6afc390] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 13.019474573s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-120432 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (13.26s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.66s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-120432 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-120432 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.66s)

TestStartStop/group/embed-certs/serial/Stop (8.23s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-120432 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-120432 --alsologtostderr -v=3: (8.228669303s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (8.23s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.39s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-120432 -n embed-certs-120432
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-120432 -n embed-certs-120432: exit status 7 (66.897755ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-120432 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.39s)

TestStartStop/group/embed-certs/serial/SecondStart (315.59s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-120432 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:05:50.742616   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:06:06.881134   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
E1128 12:06:18.430997   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:06:46.475301   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 12:06:54.999610   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 12:07:20.605511   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
E1128 12:07:25.900092   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 12:08:02.047224   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
E1128 12:08:09.533534   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 12:08:18.052476   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
E1128 12:08:20.658088   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/addons-104436/client.crt: no such file or directory
E1128 12:08:22.708836   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:22.714975   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:22.725234   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:22.746331   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:22.786705   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:22.868379   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:23.028602   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:23.349666   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:23.991393   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:25.272635   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:27.833767   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:32.954967   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:40.036799   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/enable-default-cni-113537/client.crt: no such file or directory
E1128 12:08:43.196894   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:08:48.944371   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory

=== CONT  TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-120432 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.3: (5m15.408532664s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-120432 -n embed-certs-120432
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (315.59s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-84b68f675b-49cpn" [01d64409-7408-44db-9b64-fcb6ecbc10bb] Running
E1128 12:08:54.195127   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/functional-105100/client.crt: no such file or directory
E1128 12:08:57.048756   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.012337483s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-84b68f675b-49cpn" [01d64409-7408-44db-9b64-fcb6ecbc10bb] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.005033018s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-115518 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.16s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p old-k8s-version-115518 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.16s)

TestStartStop/group/old-k8s-version/serial/Pause (1.77s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-115518 --alsologtostderr -v=1
E1128 12:09:03.677795   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-115518 -n old-k8s-version-115518
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-115518 -n old-k8s-version-115518: exit status 2 (154.863138ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-115518 -n old-k8s-version-115518
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-115518 -n old-k8s-version-115518: exit status 2 (147.376809ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-115518 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-115518 -n old-k8s-version-115518
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-115518 -n old-k8s-version-115518
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.77s)

TestStartStop/group/default-k8s-diff-port/serial/FirstStart (56.52s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-120911 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:09:26.556715   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:09:44.668766   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:09:58.670794   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/bridge-113537/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-diff-port-120911 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.3: (56.515503895s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (56.52s)

TestStartStop/group/default-k8s-diff-port/serial/DeployApp (13.26s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-120911 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [badc05cd-1367-493b-ba80-3c45ad5f13ca] Pending
helpers_test.go:342: "busybox" [badc05cd-1367-493b-ba80-3c45ad5f13ca] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [badc05cd-1367-493b-ba80-3c45ad5f13ca] Running
E1128 12:10:20.192126   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/cilium-113537/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: integration-test=busybox healthy within 13.014866324s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-120911 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (13.26s)

TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (0.64s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-120911 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-120911 describe deploy/metrics-server -n kube-system
E1128 12:10:21.911022   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/skaffold-113417/client.crt: no such file or directory
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (0.64s)

TestStartStop/group/default-k8s-diff-port/serial/Stop (3.22s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-diff-port-120911 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-diff-port-120911 --alsologtostderr -v=3: (3.215209388s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (3.22s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.3s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911: exit status 7 (66.911451ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-diff-port-120911 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.30s)
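The harness above notes `status error: exit status 7 (may be ok)`: `minikube status` exits non-zero for non-running states, so callers must map exit codes rather than treat any non-zero exit as a failure. A sketch of that mapping; the code-to-state pairs (7 for a stopped host, 2 for a paused or partially stopped cluster) are inferred from this report, not from a documented minikube contract:

```python
# Hypothetical classifier for (exit code, stdout) pairs from
# `minikube status --format={{.Host}}`; the mapping is inferred from this log.
def classify_status(exit_code: int, stdout: str) -> str:
    state = stdout.strip()
    if exit_code == 0:
        return f"running:{state}"
    if exit_code == 7:   # observed after `minikube stop`
        return f"stopped:{state}"
    if exit_code == 2:   # observed while the cluster is paused
        return f"paused:{state}"
    return f"unexpected({exit_code}):{state}"

print(classify_status(7, "Stopped\n"))  # -> stopped:Stopped
```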

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/SecondStart (311.99s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-120911 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:10:50.781573   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
E1128 12:11:04.261478   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.267230   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.277972   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.298220   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.338280   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.418414   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.578715   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:04.899809   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory

                                                
                                                
=== CONT  TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-diff-port-120911 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.3: (5m11.762890121s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (311.99s)

                                                
                                    
TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (9.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-w8xxs" [03e043af-71b8-4134-b32e-11504ba46857] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E1128 12:11:05.541261   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:06.595844   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/no-preload-115704/client.crt: no such file or directory
E1128 12:11:06.822829   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:06.921390   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-w8xxs" [03e043af-71b8-4134-b32e-11504ba46857] Running
E1128 12:11:09.384555   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 9.011690651s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (9.01s)

                                                
                                    
TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-w8xxs" [03e043af-71b8-4134-b32e-11504ba46857] Running
E1128 12:11:14.504830   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.005449435s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-120432 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

                                                
                                    
TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p embed-certs-120432 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.17s)
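VerifyKubernetesImages runs `sudo crictl images -o json` in the VM and reports images outside the expected Kubernetes set (here `gcr.io/k8s-minikube/busybox:1.28.4-glibc`). A sketch of that filter; the JSON shape (`{"images": [{"repoTags": [...]}]}`) matches crictl's output as I understand it, and `EXPECTED_PREFIXES` is an illustrative allow-list, not the harness's real one:

```python
import json

# Illustrative allow-list; the real test compares against the image set
# bundled with the Kubernetes version under test.
EXPECTED_PREFIXES = ("registry.k8s.io/", "k8s.gcr.io/", "docker.io/kubernetesui/")

def non_minikube_images(crictl_json: str) -> list[str]:
    """Return repo tags from `crictl images -o json` output that fall outside the allow-list."""
    images = json.loads(crictl_json).get("images", [])
    tags = [tag for image in images for tag in image.get("repoTags", [])]
    return [tag for tag in tags if not tag.startswith(EXPECTED_PREFIXES)]

sample = json.dumps({"images": [
    {"repoTags": ["k8s.gcr.io/pause:3.8"]},
    {"repoTags": ["gcr.io/k8s-minikube/busybox:1.28.4-glibc"]},
]})
print(non_minikube_images(sample))  # -> ['gcr.io/k8s-minikube/busybox:1.28.4-glibc']
```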

                                                
                                    
TestStartStop/group/embed-certs/serial/Pause (1.91s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-120432 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-120432 -n embed-certs-120432
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-120432 -n embed-certs-120432: exit status 2 (157.003862ms)

                                                
                                                
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-120432 -n embed-certs-120432
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-120432 -n embed-certs-120432: exit status 2 (158.650058ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-120432 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-120432 -n embed-certs-120432
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-120432 -n embed-certs-120432
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.91s)
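The Pause step above asserts a small state machine: after `pause`, `{{.APIServer}}` reports `Paused` and `{{.Kubelet}}` reports `Stopped` (each via exit status 2, treated as acceptable); after `unpause`, both status calls succeed again. A table-driven sketch of those expectations; the post-unpause states are inferred from the passing run rather than shown in the log:

```python
# Expected (state, exit code) per component after each action; inferred from
# this report, not from a minikube specification.
EXPECTED = {
    "pause":   {"APIServer": ("Paused", 2),  "Kubelet": ("Stopped", 2)},
    "unpause": {"APIServer": ("Running", 0), "Kubelet": ("Running", 0)},
}

def as_expected(action: str, component: str, state: str, exit_code: int) -> bool:
    return EXPECTED[action][component] == (state, exit_code)

print(as_expected("pause", "Kubelet", "Stopped", 2))  # -> True
```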

                                                
                                    
TestStartStop/group/newest-cni/serial/FirstStart (52.95s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-121127 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:11:45.228214   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
E1128 12:11:46.515778   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/ingress-addon-legacy-105515/client.crt: no such file or directory
E1128 12:11:55.040011   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/auto-113537/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-121127 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.3: (52.94507311s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (52.95s)

                                                
                                    
TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.21s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-121127 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
E1128 12:12:20.647722   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/false-113537/client.crt: no such file or directory
start_stop_delete_test.go:205: (dbg) Done: out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-121127 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.211971199s)
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.21s)

                                                
                                    
TestStartStop/group/newest-cni/serial/Stop (8.25s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-121127 --alsologtostderr -v=3
E1128 12:12:25.940469   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kindnet-113537/client.crt: no such file or directory
E1128 12:12:26.189910   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/old-k8s-version-115518/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-121127 --alsologtostderr -v=3: (8.250696578s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.25s)

                                                
                                    
TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.3s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-121127 -n newest-cni-121127
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-121127 -n newest-cni-121127: exit status 7 (64.681252ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-121127 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.30s)

                                                
                                    
TestStartStop/group/newest-cni/serial/SecondStart (31.7s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-121127 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.3
E1128 12:12:29.998950   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/custom-flannel-113537/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-121127 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.3: (31.512494851s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-121127 -n newest-cni-121127
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (31.70s)

                                                
                                    
TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p newest-cni-121127 "sudo crictl images -o json"
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

                                                
                                    
TestStartStop/group/newest-cni/serial/Pause (1.81s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-121127 --alsologtostderr -v=1
E1128 12:13:02.087470   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/calico-113537/client.crt: no such file or directory
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-121127 -n newest-cni-121127
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-121127 -n newest-cni-121127: exit status 2 (168.605087ms)

                                                
                                                
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-121127 -n newest-cni-121127
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-121127 -n newest-cni-121127: exit status 2 (160.019363ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-121127 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-121127 -n newest-cni-121127
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-121127 -n newest-cni-121127
--- PASS: TestStartStop/group/newest-cni/serial/Pause (1.81s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (12.01s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-2g2lq" [5e288c27-8b20-4e37-94b5-97669fe84ce1] Pending
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-2g2lq" [5e288c27-8b20-4e37-94b5-97669fe84ce1] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-2g2lq" [5e288c27-8b20-4e37-94b5-97669fe84ce1] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 12.01136554s
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (12.01s)
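The wait above polls pods matching `k8s-app=kubernetes-dashboard` until one is healthy, and the log shows the pod moving from Pending, through Running with `Ready:ContainersNotReady`, to Running. A sketch of the health predicate behind that progression; the pod records here are simplified dicts standing in for the real API objects the harness watches:

```python
# Simplified stand-in for a Pod object; the real harness watches the API server.
def is_healthy(pod: dict) -> bool:
    """Healthy once the pod phase is Running and every container reports ready."""
    return pod["phase"] == "Running" and all(pod.get("containersReady", [False]))

# The three snapshots seen for kubernetes-dashboard-f87d45d87-2g2lq above.
snapshots = [
    {"phase": "Pending", "containersReady": [False]},
    {"phase": "Running", "containersReady": [False]},  # Ready:ContainersNotReady
    {"phase": "Running", "containersReady": [True]},
]
print([is_healthy(pod) for pod in snapshots])  # -> [False, False, True]
```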

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-f87d45d87-2g2lq" [5e288c27-8b20-4e37-94b5-97669fe84ce1] Running
E1128 12:15:49.628221   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/flannel-113537/client.crt: no such file or directory
E1128 12:15:50.787665   15823 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/15411-14646/.minikube/profiles/kubenet-113537/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.007480761s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-diff-port-120911 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.06s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p default-k8s-diff-port-120911 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.17s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/Pause (1.84s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p default-k8s-diff-port-120911 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911: exit status 2 (157.491418ms)

                                                
                                                
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911: exit status 2 (151.053737ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p default-k8s-diff-port-120911 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-120911 -n default-k8s-diff-port-120911
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Pause (1.84s)

                                                
                                    

Test skip (16/301)

TestDownloadOnly/v1.16.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.16.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

                                                
                                    
TestDownloadOnly/v1.25.3/cached-images (0s)

=== RUN   TestDownloadOnly/v1.25.3/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.25.3/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.25.3/binaries (0s)

=== RUN   TestDownloadOnly/v1.25.3/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.25.3/binaries (0.00s)

                                                
                                    
TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:214: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

                                                
                                    
TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:451: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

                                                
                                    
TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

                                                
                                    
TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:543: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

                                                
                                    
TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

                                                
                                    
TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

                                                
                                    
TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

                                                
                                    
TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:291: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestStartStop/group/disable-driver-mounts (0.41s)

=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-120911" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-120911
--- SKIP: TestStartStop/group/disable-driver-mounts (0.41s)