Test Report: Hyperkit_macOS 14995

411d4579fd248fd57a4259437564c3e08f354535:2022-09-21:25810

Tests failed (2/299)

Order  Failed test                                     Duration
247    TestPause/serial/SecondStartNoReconfiguration   79.73s
291    TestNetworkPlugins/group/kubenet/HairPin        57.31s
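
The first failure (pause_test.go:100 in the log below) asserts that the second `minikube start` run logs a marker string indicating the cluster was reused without reconfiguration. A minimal sketch of that substring check, with a hypothetical helper name (`secondStartOK`), assuming the real test captures the command's combined output:

```go
package main

import (
	"fmt"
	"strings"
)

// wantNoReconfig is the marker the test expects to find in the
// second-start output (see pause_test.go:100 in the failure below).
const wantNoReconfig = "The running cluster does not require reconfiguration"

// secondStartOK reports whether the captured start log shows the
// cluster was reused without being reconfigured.
func secondStartOK(output string) bool {
	return strings.Contains(output, wantNoReconfig)
}

func main() {
	// The output in this failure contained "Updating the running
	// hyperkit ... VM" instead of the marker, so the check fails.
	got := `* Updating the running hyperkit "pause-20220921152522-3535" VM ...`
	fmt.Println(secondStartOK(got)) // false: marker absent
}
```

In this run the marker never appeared because minikube chose to update the running VM rather than skip reconfiguration, which is exactly what the assertion is designed to catch.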
TestPause/serial/SecondStartNoReconfiguration (79.73s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220921152522-3535 --alsologtostderr -v=1 --driver=hyperkit 
E0921 15:26:22.139035    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:26:49.831517    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220921152522-3535 --alsologtostderr -v=1 --driver=hyperkit : (1m13.31014205s)
pause_test.go:100: expected the second start log output to include "The running cluster does not require reconfiguration" but got: 
-- stdout --
	* [pause-20220921152522-3535] minikube v1.27.0 on Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	* Using the hyperkit driver based on existing profile
	* Starting control plane node pause-20220921152522-3535 in cluster pause-20220921152522-3535
	* Updating the running hyperkit "pause-20220921152522-3535" VM ...
	* Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: storage-provisioner, default-storageclass
	* Done! kubectl is now configured to use "pause-20220921152522-3535" cluster and "default" namespace by default

-- /stdout --
** stderr ** 
	I0921 15:26:16.412297   10408 out.go:296] Setting OutFile to fd 1 ...
	I0921 15:26:16.412857   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.412883   10408 out.go:309] Setting ErrFile to fd 2...
	I0921 15:26:16.412925   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.413172   10408 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 15:26:16.413935   10408 out.go:303] Setting JSON to false
	I0921 15:26:16.429337   10408 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":5147,"bootTime":1663794029,"procs":382,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 15:26:16.429439   10408 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 15:26:16.451061   10408 out.go:177] * [pause-20220921152522-3535] minikube v1.27.0 on Darwin 12.6
	I0921 15:26:16.492895   10408 notify.go:214] Checking for updates...
	I0921 15:26:16.513942   10408 out.go:177]   - MINIKUBE_LOCATION=14995
	I0921 15:26:16.535147   10408 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:26:16.555899   10408 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 15:26:16.577004   10408 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 15:26:16.598036   10408 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	I0921 15:26:16.619232   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:16.619572   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.619620   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.626042   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52950
	I0921 15:26:16.626541   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.626992   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.627004   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.627211   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.627372   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.627501   10408 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 15:26:16.627783   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.627806   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.634000   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52952
	I0921 15:26:16.634367   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.634679   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.634691   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.634960   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.635067   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.661930   10408 out.go:177] * Using the hyperkit driver based on existing profile
	I0921 15:26:16.703890   10408 start.go:284] selected driver: hyperkit
	I0921 15:26:16.703910   10408 start.go:808] validating driver "hyperkit" against &{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterNam
e:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort
:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.704025   10408 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0921 15:26:16.704092   10408 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.704203   10408 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0921 15:26:16.710571   10408 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.27.0
	I0921 15:26:16.713621   10408 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.713649   10408 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0921 15:26:16.715630   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:16.715647   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:16.715664   10408 start_flags.go:316] config:
	{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNa
mes:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMe
trics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.715818   10408 iso.go:124] acquiring lock: {Name:mke8c57399926d29e846b47dd4be4625ba5fcaea Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.774023   10408 out.go:177] * Starting control plane node pause-20220921152522-3535 in cluster pause-20220921152522-3535
	I0921 15:26:16.794876   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:16.794956   10408 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4
	I0921 15:26:16.795012   10408 cache.go:57] Caching tarball of preloaded images
	I0921 15:26:16.795122   10408 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0921 15:26:16.795144   10408 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.2 on docker
	I0921 15:26:16.795239   10408 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/config.json ...
	I0921 15:26:16.795594   10408 cache.go:208] Successfully downloaded all kic artifacts
	I0921 15:26:16.795620   10408 start.go:364] acquiring machines lock for pause-20220921152522-3535: {Name:mk2f7774d81f069136708da9f7558413d7930511 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0921 15:26:19.803647   10408 start.go:368] acquired machines lock for "pause-20220921152522-3535" in 3.008011859s
	I0921 15:26:19.803693   10408 start.go:96] Skipping create...Using existing machine configuration
	I0921 15:26:19.803704   10408 fix.go:55] fixHost starting: 
	I0921 15:26:19.804014   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:19.804040   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:19.810489   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52975
	I0921 15:26:19.810845   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:19.811156   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:19.811167   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:19.811357   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:19.811458   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.811557   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:26:19.811664   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:19.811739   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:26:19.812542   10408 fix.go:103] recreateIfNeeded on pause-20220921152522-3535: state=Running err=<nil>
	W0921 15:26:19.812564   10408 fix.go:129] unexpected machine state, will restart: <nil>
	I0921 15:26:19.835428   10408 out.go:177] * Updating the running hyperkit "pause-20220921152522-3535" VM ...
	I0921 15:26:19.856170   10408 machine.go:88] provisioning docker machine ...
	I0921 15:26:19.856192   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.856377   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856478   10408 buildroot.go:166] provisioning hostname "pause-20220921152522-3535"
	I0921 15:26:19.856489   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856574   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.856646   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.856744   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856835   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856914   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.857028   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.857193   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.857203   10408 main.go:134] libmachine: About to run SSH command:
	sudo hostname pause-20220921152522-3535 && echo "pause-20220921152522-3535" | sudo tee /etc/hostname
	I0921 15:26:19.929633   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: pause-20220921152522-3535
	
	I0921 15:26:19.929693   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.929883   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.930020   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930143   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930253   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.930438   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.930577   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.930595   10408 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-20220921152522-3535' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-20220921152522-3535/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-20220921152522-3535' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0921 15:26:19.992780   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:19.992803   10408 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem ServerCertRemotePath
:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube}
	I0921 15:26:19.992832   10408 buildroot.go:174] setting up certificates
	I0921 15:26:19.992843   10408 provision.go:83] configureAuth start
	I0921 15:26:19.992852   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.993017   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:19.993132   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.993213   10408 provision.go:138] copyHostCerts
	I0921 15:26:19.993302   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem, removing ...
	I0921 15:26:19.993310   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem
	I0921 15:26:19.993450   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem (1123 bytes)
	I0921 15:26:19.993643   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem, removing ...
	I0921 15:26:19.993649   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem
	I0921 15:26:19.993780   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem (1679 bytes)
	I0921 15:26:19.994087   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem, removing ...
	I0921 15:26:19.994094   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem
	I0921 15:26:19.994203   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem (1078 bytes)
	I0921 15:26:19.994341   10408 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem org=jenkins.pause-20220921152522-3535 san=[192.168.64.28 192.168.64.28 localhost 127.0.0.1 minikube pause-20220921152522-3535]
	I0921 15:26:20.145157   10408 provision.go:172] copyRemoteCerts
	I0921 15:26:20.145229   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0921 15:26:20.145247   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.145395   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.145492   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.145591   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.145687   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.181860   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0921 15:26:20.204288   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0921 15:26:20.223046   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0921 15:26:20.242859   10408 provision.go:86] duration metric: configureAuth took 250.000259ms
	I0921 15:26:20.242872   10408 buildroot.go:189] setting minikube options for container-runtime
	I0921 15:26:20.243031   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:20.243050   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.243218   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.243320   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.243440   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243555   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243661   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.243798   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.243914   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.243922   10408 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0921 15:26:20.307004   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0921 15:26:20.307030   10408 buildroot.go:70] root file system type: tmpfs
	I0921 15:26:20.307188   10408 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0921 15:26:20.307206   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.307379   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.307501   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307587   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307679   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.307823   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.307954   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.308011   10408 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0921 15:26:20.380017   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0921 15:26:20.380044   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.380193   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.380302   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380410   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380514   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.380665   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.380781   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.380797   10408 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0921 15:26:20.447616   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
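	The SSH command above uses an update-only-if-changed idiom: `diff -u` exits 0 when the two files match, so the replace-and-restart branch after `||` fires only when the freshly rendered unit actually differs. A local sketch of the same pattern (file names and contents are illustrative):

```shell
# diff exits 0 on identical files, so the || branch is skipped; when the new
# rendering differs, the branch replaces the unit (and, on the real host,
# would daemon-reload and restart the service).
old=$(mktemp); new=$(mktemp)
printf 'ExecStart=/usr/bin/dockerd\n' > "$old"
printf 'ExecStart=/usr/bin/dockerd\n' > "$new"
diff -u "$old" "$new" >/dev/null && changed=no || { mv "$new" "$old"; changed=yes; }
echo "first pass: changed=$changed"

# Render a different unit; now the replace branch fires.
printf 'ExecStart=/usr/bin/dockerd --debug\n' > "$new"
diff -u "$old" "$new" >/dev/null && changed=no || { mv "$new" "$old"; changed=yes; }
echo "second pass: changed=$changed"
rm -f "$old" "$new"
```

	This is why an unchanged configuration produces an empty SSH output here and avoids a docker restart.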
	I0921 15:26:20.447629   10408 machine.go:91] provisioned docker machine in 591.445478ms
	I0921 15:26:20.447641   10408 start.go:300] post-start starting for "pause-20220921152522-3535" (driver="hyperkit")
	I0921 15:26:20.447646   10408 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0921 15:26:20.447659   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.447885   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0921 15:26:20.447901   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.448051   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.448156   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.448291   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.448405   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.484862   10408 ssh_runner.go:195] Run: cat /etc/os-release
	I0921 15:26:20.487726   10408 info.go:137] Remote host: Buildroot 2021.02.12
	I0921 15:26:20.487742   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/addons for local assets ...
	I0921 15:26:20.487867   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files for local assets ...
	I0921 15:26:20.488046   10408 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem -> 35352.pem in /etc/ssl/certs
	I0921 15:26:20.488202   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0921 15:26:20.495074   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:20.515167   10408 start.go:303] post-start completed in 67.502258ms
	I0921 15:26:20.515187   10408 fix.go:57] fixHost completed within 711.484594ms
	I0921 15:26:20.515203   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.515368   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.515520   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515638   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515770   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.515941   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.516053   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.516063   10408 main.go:134] libmachine: About to run SSH command:
	date +%s.%N
	I0921 15:26:20.577712   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 1663799180.686854068
	
	I0921 15:26:20.577735   10408 fix.go:207] guest clock: 1663799180.686854068
	I0921 15:26:20.577746   10408 fix.go:220] Guest: 2022-09-21 15:26:20.686854068 -0700 PDT Remote: 2022-09-21 15:26:20.51519 -0700 PDT m=+4.146234536 (delta=171.664068ms)
	I0921 15:26:20.577765   10408 fix.go:191] guest clock delta is within tolerance: 171.664068ms
	I0921 15:26:20.577770   10408 start.go:83] releasing machines lock for "pause-20220921152522-3535", held for 774.111447ms
	I0921 15:26:20.577789   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.577928   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:20.578042   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578174   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578318   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578705   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578906   10408 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0921 15:26:20.578961   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.578984   10408 ssh_runner.go:195] Run: systemctl --version
	I0921 15:26:20.578999   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.579066   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579106   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579182   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579228   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579290   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579338   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579415   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.579448   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.650058   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:20.650150   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:20.668593   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:20.668610   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:20.668676   10408 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0921 15:26:20.679656   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0921 15:26:20.692651   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:20.702013   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0921 15:26:20.715942   10408 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0921 15:26:20.844184   10408 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0921 15:26:20.974988   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:21.117162   10408 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:29.173173   10408 ssh_runner.go:235] Completed: sudo systemctl restart docker: (8.055980768s)
	I0921 15:26:29.173240   10408 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0921 15:26:29.288535   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:29.417731   10408 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0921 15:26:29.433270   10408 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0921 15:26:29.433356   10408 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0921 15:26:29.447293   10408 start.go:471] Will wait 60s for crictl version
	I0921 15:26:29.447353   10408 ssh_runner.go:195] Run: sudo crictl version
	I0921 15:26:29.482799   10408 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.18
	RuntimeApiVersion:  1.41.0
	I0921 15:26:29.482858   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.651357   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.808439   10408 out.go:204] * Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	I0921 15:26:29.808534   10408 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0921 15:26:29.818111   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:29.818177   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.873620   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.873633   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:29.873699   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.929931   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.929952   10408 cache_images.go:84] Images are preloaded, skipping loading
	I0921 15:26:29.930056   10408 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0921 15:26:30.064287   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:30.064305   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:30.064320   10408 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0921 15:26:30.064331   10408 kubeadm.go:156] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.28 APIServerPort:8443 KubernetesVersion:v1.25.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-20220921152522-3535 NodeName:pause-20220921152522-3535 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.28"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.28 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false}
	I0921 15:26:30.064423   10408 kubeadm.go:161] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.28
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "pause-20220921152522-3535"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.28
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.28"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
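	The rendered kubeadm config above is a four-document YAML stream separated by `---`. A quick structural sanity pass over such a file can be sketched as follows (the sample reproduces only the `kind` skeleton from the log; the real file carries the full options shown above):

```shell
# Each ---separated document in a kubeadm config should declare exactly one
# kind; with N separators there should be N+1 kinds. Sample skeleton only.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta3
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta3
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
separators=$(grep -c '^---$' "$cfg")
kinds=$(grep -c '^kind:' "$cfg")
if [ "$kinds" -eq $((separators + 1)) ]; then
  echo "ok: $kinds documents, one kind each"
else
  echo "mismatch: $kinds kinds across $((separators + 1)) documents"
fi
rm -f "$cfg"
```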
	I0921 15:26:30.064505   10408 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=pause-20220921152522-3535 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.28 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I0921 15:26:30.064579   10408 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.2
	I0921 15:26:30.076550   10408 binaries.go:44] Found k8s binaries, skipping transfer
	I0921 15:26:30.076638   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0921 15:26:30.090012   10408 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (488 bytes)
	I0921 15:26:30.137803   10408 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0921 15:26:30.178146   10408 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2050 bytes)
	I0921 15:26:30.203255   10408 ssh_runner.go:195] Run: grep 192.168.64.28	control-plane.minikube.internal$ /etc/hosts
	I0921 15:26:30.209779   10408 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535 for IP: 192.168.64.28
	I0921 15:26:30.209879   10408 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key
	I0921 15:26:30.209934   10408 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key
	I0921 15:26:30.210019   10408 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key
	I0921 15:26:30.210082   10408 certs.go:298] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key.6733b561
	I0921 15:26:30.210133   10408 certs.go:298] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key
	I0921 15:26:30.210333   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem (1338 bytes)
	W0921 15:26:30.210375   10408 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535_empty.pem, impossibly tiny 0 bytes
	I0921 15:26:30.210388   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem (1679 bytes)
	I0921 15:26:30.210421   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem (1078 bytes)
	I0921 15:26:30.210453   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem (1123 bytes)
	I0921 15:26:30.210483   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem (1679 bytes)
	I0921 15:26:30.210550   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:30.211086   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0921 15:26:30.279069   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0921 15:26:30.343250   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0921 15:26:30.413180   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0921 15:26:30.448798   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0921 15:26:30.476175   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0921 15:26:30.497204   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0921 15:26:30.524103   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0921 15:26:30.558966   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /usr/share/ca-certificates/35352.pem (1708 bytes)
	I0921 15:26:30.576319   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0921 15:26:30.592912   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem --> /usr/share/ca-certificates/3535.pem (1338 bytes)
	I0921 15:26:30.609099   10408 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0921 15:26:30.627179   10408 ssh_runner.go:195] Run: openssl version
	I0921 15:26:30.632801   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3535.pem && ln -fs /usr/share/ca-certificates/3535.pem /etc/ssl/certs/3535.pem"
	I0921 15:26:30.641473   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645794   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep 21 21:31 /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645836   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.649794   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3535.pem /etc/ssl/certs/51391683.0"
	I0921 15:26:30.657630   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/35352.pem && ln -fs /usr/share/ca-certificates/35352.pem /etc/ssl/certs/35352.pem"
	I0921 15:26:30.665747   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669804   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep 21 21:31 /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669850   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.679638   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/35352.pem /etc/ssl/certs/3ec20f2e.0"
	I0921 15:26:30.700907   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0921 15:26:30.734369   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762750   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep 21 21:27 /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762827   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.777627   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0921 15:26:30.785856   10408 kubeadm.go:396] StartCluster: {Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:30.785963   10408 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:26:30.816264   10408 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0921 15:26:30.823179   10408 kubeadm.go:411] found existing configuration files, will attempt cluster restart
	I0921 15:26:30.823195   10408 kubeadm.go:627] restartCluster start
	I0921 15:26:30.823236   10408 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0921 15:26:30.837045   10408 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:26:30.837457   10408 kubeconfig.go:92] found "pause-20220921152522-3535" server: "https://192.168.64.28:8443"
	I0921 15:26:30.837839   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:26:30.838375   10408 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0921 15:26:30.852535   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:30.852588   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:30.868059   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:30.876185   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:30.876238   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:30.912452   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:30.912472   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:35.914013   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:35.914061   10408 retry.go:31] will retry after 263.082536ms: state is "Stopped"
	I0921 15:26:36.179260   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:41.180983   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:41.181007   10408 retry.go:31] will retry after 381.329545ms: state is "Stopped"
	I0921 15:26:41.563913   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:46.564586   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:46.766257   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:46.766358   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:46.776615   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:46.782756   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:46.782801   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:46.789298   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:46.789309   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.288815   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": read tcp 192.168.64.1:52998->192.168.64.28:8443: read: connection reset by peer
	I0921 15:26:51.288848   10408 retry.go:31] will retry after 242.214273ms: state is "Stopped"
	I0921 15:26:51.532207   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.632400   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:51.632425   10408 retry.go:31] will retry after 300.724609ms: state is "Stopped"
	I0921 15:26:51.934415   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.035144   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.035176   10408 retry.go:31] will retry after 427.113882ms: state is "Stopped"
	I0921 15:26:52.464328   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.566391   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.566426   10408 retry.go:31] will retry after 382.2356ms: state is "Stopped"
	I0921 15:26:52.948987   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.049570   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.049605   10408 retry.go:31] will retry after 505.529557ms: state is "Stopped"
	I0921 15:26:53.556334   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.658245   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.658268   10408 retry.go:31] will retry after 609.195524ms: state is "Stopped"
	I0921 15:26:54.269593   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:54.371296   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:54.371340   10408 retry.go:31] will retry after 858.741692ms: state is "Stopped"
	I0921 15:26:55.230116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:55.331214   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:55.331251   10408 retry.go:31] will retry after 1.201160326s: state is "Stopped"
	I0921 15:26:56.533116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:56.635643   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:56.635670   10408 retry.go:31] will retry after 1.723796097s: state is "Stopped"
	I0921 15:26:58.359704   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:58.461478   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:58.461505   10408 retry.go:31] will retry after 1.596532639s: state is "Stopped"
	I0921 15:27:00.059136   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:00.159945   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:27:00.159971   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:27:00.160018   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0921 15:27:00.169632   10408 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:00.169647   10408 kubeadm.go:602] needs reconfigure: apiserver error: timed out waiting for the condition
	I0921 15:27:00.169656   10408 kubeadm.go:1114] stopping kube-system containers ...
	I0921 15:27:00.169722   10408 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:27:00.201882   10408 docker.go:443] Stopping containers: [d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49]
	I0921 15:27:00.201952   10408 ssh_runner.go:195] Run: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49
	I0921 15:27:05.344188   10408 ssh_runner.go:235] Completed: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49: (5.142213633s)
	I0921 15:27:05.344244   10408 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0921 15:27:05.419551   10408 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0921 15:27:05.433375   10408 kubeadm.go:155] found existing configuration files:
	-rw------- 1 root root 5643 Sep 21 22:25 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5657 Sep 21 22:25 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Sep 21 22:25 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5601 Sep 21 22:25 /etc/kubernetes/scheduler.conf
	
	I0921 15:27:05.433432   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0921 15:27:05.439704   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0921 15:27:05.445874   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.453215   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.453270   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.459417   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0921 15:27:05.465309   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.465358   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0921 15:27:05.476008   10408 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484410   10408 kubeadm.go:704] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484426   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:05.534434   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.469884   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.628867   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.698897   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.759299   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:06.759353   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:06.778540   10408 api_server.go:71] duration metric: took 19.241402ms to wait for apiserver process to appear ...
	I0921 15:27:06.778552   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:06.778559   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:11.780440   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:27:12.280518   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.000183   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0921 15:27:14.000198   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0921 15:27:14.282668   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.289281   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.289293   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:14.780762   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.786529   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.786540   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:15.280930   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:15.288106   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:15.292969   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:15.292981   10408 api_server.go:130] duration metric: took 8.514415313s to wait for apiserver health ...
	I0921 15:27:15.292986   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:27:15.292994   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:27:15.293004   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:15.298309   10408 system_pods.go:59] 6 kube-system pods found
	I0921 15:27:15.298324   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:15.298330   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0921 15:27:15.298335   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0921 15:27:15.298340   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0921 15:27:15.298344   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:15.298348   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0921 15:27:15.298352   10408 system_pods.go:74] duration metric: took 5.344262ms to wait for pod list to return data ...
	I0921 15:27:15.298357   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:15.300304   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:15.300319   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:15.300328   10408 node_conditions.go:105] duration metric: took 1.967816ms to run NodePressure ...
	I0921 15:27:15.300342   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:15.402185   10408 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405062   10408 kubeadm.go:778] kubelet initialised
	I0921 15:27:15.405072   10408 kubeadm.go:779] duration metric: took 2.873657ms waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405080   10408 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:15.408132   10408 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411452   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:15.411459   10408 pod_ready.go:81] duration metric: took 3.317632ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411465   10408 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:17.420289   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:19.421503   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:21.919889   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:24.419226   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:25.920028   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.920043   10408 pod_ready.go:81] duration metric: took 10.508561161s waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.920049   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923063   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.923071   10408 pod_ready.go:81] duration metric: took 3.017613ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923077   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926284   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.926292   10408 pod_ready.go:81] duration metric: took 3.20987ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926297   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929448   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.929456   10408 pod_ready.go:81] duration metric: took 3.154194ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929461   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932599   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.932606   10408 pod_ready.go:81] duration metric: took 3.140486ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932610   10408 pod_ready.go:38] duration metric: took 10.527510396s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:25.932619   10408 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0921 15:27:25.939997   10408 ops.go:34] apiserver oom_adj: -16
	I0921 15:27:25.940008   10408 kubeadm.go:631] restartCluster took 55.116747244s
	I0921 15:27:25.940013   10408 kubeadm.go:398] StartCluster complete in 55.154103553s
	I0921 15:27:25.940027   10408 settings.go:142] acquiring lock: {Name:mkb00f1de0b91d8f67bd982eab088d27845674b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.940102   10408 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:27:25.941204   10408 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig: {Name:mka2f83e1cbd4124ff7179732fbb172d977cf2f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.942042   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:25.944188   10408 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-20220921152522-3535" rescaled to 1
	I0921 15:27:25.944221   10408 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0921 15:27:25.944255   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0921 15:27:25.944277   10408 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0921 15:27:25.944378   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:27:25.967437   10408 addons.go:65] Setting storage-provisioner=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967440   10408 addons.go:65] Setting default-storageclass=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967359   10408 out.go:177] * Verifying Kubernetes components...
	I0921 15:27:25.967453   10408 addons.go:153] Setting addon storage-provisioner=true in "pause-20220921152522-3535"
	I0921 15:27:25.967457   10408 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-20220921152522-3535"
	W0921 15:27:25.967460   10408 addons.go:162] addon storage-provisioner should already be in state true
	I0921 15:27:26.012377   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:26.012436   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.012762   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012761   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012794   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.012829   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.019897   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53028
	I0921 15:27:26.020028   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53029
	I0921 15:27:26.020328   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020394   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020706   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020719   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020801   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020817   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020929   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021015   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021115   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.021203   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.021283   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.021419   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.021443   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.023750   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:26.027574   10408 addons.go:153] Setting addon default-storageclass=true in "pause-20220921152522-3535"
	W0921 15:27:26.027587   10408 addons.go:162] addon default-storageclass should already be in state true
	I0921 15:27:26.027606   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.027788   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53032
	I0921 15:27:26.027854   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.027880   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.028560   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.029753   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.029767   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.030003   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.030113   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.030207   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.030282   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.031135   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.034331   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53034
	I0921 15:27:26.055199   10408 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0921 15:27:26.038435   10408 node_ready.go:35] waiting up to 6m0s for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.038466   10408 start.go:790] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0921 15:27:26.055642   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.075151   10408 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.075161   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0921 15:27:26.075184   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.075306   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.075441   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.075451   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.075455   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.075546   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.075643   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.075669   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.076075   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.076097   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.082485   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53037
	I0921 15:27:26.082858   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.083217   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.083234   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.083443   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.083534   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.083608   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.083699   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.084503   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.084648   10408 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.084657   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0921 15:27:26.084665   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.084734   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.084830   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.084916   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.085010   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.117393   10408 node_ready.go:49] node "pause-20220921152522-3535" has status "Ready":"True"
	I0921 15:27:26.117403   10408 node_ready.go:38] duration metric: took 42.373374ms waiting for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.117410   10408 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:26.127239   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.137634   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.319821   10408 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.697611   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697627   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697784   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697793   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697804   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697836   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.697938   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697946   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697962   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712622   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712636   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712825   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712834   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712839   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712844   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712846   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712954   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712962   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712969   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712973   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712981   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.713114   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.713128   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.713142   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.735926   10408 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0921 15:27:26.773142   10408 addons.go:414] enableAddons completed in 828.831417ms
	I0921 15:27:26.776027   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:26.776040   10408 pod_ready.go:81] duration metric: took 456.205251ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.776049   10408 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117622   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.117632   10408 pod_ready.go:81] duration metric: took 341.577773ms waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117638   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518637   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.518650   10408 pod_ready.go:81] duration metric: took 401.006674ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518660   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918763   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.918778   10408 pod_ready.go:81] duration metric: took 400.10892ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918787   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318657   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.318670   10408 pod_ready.go:81] duration metric: took 399.877205ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318678   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720230   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.720243   10408 pod_ready.go:81] duration metric: took 401.55845ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720250   10408 pod_ready.go:38] duration metric: took 2.602830576s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:28.720263   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:28.720316   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:28.729887   10408 api_server.go:71] duration metric: took 2.78564504s to wait for apiserver process to appear ...
	I0921 15:27:28.729899   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:28.729905   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:28.733744   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:28.734313   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:28.734323   10408 api_server.go:130] duration metric: took 4.419338ms to wait for apiserver health ...
	I0921 15:27:28.734328   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:28.920241   10408 system_pods.go:59] 7 kube-system pods found
	I0921 15:27:28.920257   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:28.920261   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:28.920274   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:28.920279   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:28.920283   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:28.920286   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:28.920289   10408 system_pods.go:61] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:28.920294   10408 system_pods.go:74] duration metric: took 185.961163ms to wait for pod list to return data ...
	I0921 15:27:28.920300   10408 default_sa.go:34] waiting for default service account to be created ...
	I0921 15:27:29.119704   10408 default_sa.go:45] found service account: "default"
	I0921 15:27:29.119720   10408 default_sa.go:55] duration metric: took 199.41576ms for default service account to be created ...
	I0921 15:27:29.119727   10408 system_pods.go:116] waiting for k8s-apps to be running ...
	I0921 15:27:29.322362   10408 system_pods.go:86] 7 kube-system pods found
	I0921 15:27:29.322375   10408 system_pods.go:89] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:29.322379   10408 system_pods.go:89] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:29.322383   10408 system_pods.go:89] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:29.322388   10408 system_pods.go:89] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:29.322391   10408 system_pods.go:89] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:29.322395   10408 system_pods.go:89] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:29.322398   10408 system_pods.go:89] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:29.322402   10408 system_pods.go:126] duration metric: took 202.671392ms to wait for k8s-apps to be running ...
	I0921 15:27:29.322407   10408 system_svc.go:44] waiting for kubelet service to be running ....
	I0921 15:27:29.322452   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:29.331792   10408 system_svc.go:56] duration metric: took 9.381149ms WaitForService to wait for kubelet.
	I0921 15:27:29.331804   10408 kubeadm.go:573] duration metric: took 3.387565971s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0921 15:27:29.331823   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:29.518084   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:29.518100   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:29.518105   10408 node_conditions.go:105] duration metric: took 186.278888ms to run NodePressure ...
	I0921 15:27:29.518113   10408 start.go:216] waiting for startup goroutines ...
	I0921 15:27:29.551427   10408 start.go:506] kubectl: 1.25.0, cluster: 1.25.2 (minor skew: 0)
	I0921 15:27:29.611327   10408 out.go:177] * Done! kubectl is now configured to use "pause-20220921152522-3535" cluster and "default" namespace by default

** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-20220921152522-3535 -n pause-20220921152522-3535
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-20220921152522-3535 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-20220921152522-3535 logs -n 25: (2.727630141s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	| Command |                  Args                  |                Profile                 |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:20 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.2           |                                        |         |         |                     |                     |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT |                     |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0           |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.2           |                                        |         |         |                     |                     |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	| start   | -p                                     | cert-expiration-20220921151821-3535    | jenkins | v1.27.0 | 21 Sep 22 15:22 PDT | 21 Sep 22 15:22 PDT |
	|         | cert-expiration-20220921151821-3535    |                                        |         |         |                     |                     |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --cert-expiration=8760h                |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | cert-expiration-20220921151821-3535    | jenkins | v1.27.0 | 21 Sep 22 15:22 PDT | 21 Sep 22 15:22 PDT |
	|         | cert-expiration-20220921151821-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | stopped-upgrade-20220921152137-3535    | jenkins | v1.27.0 | 21 Sep 22 15:23 PDT | 21 Sep 22 15:24 PDT |
	|         | stopped-upgrade-20220921152137-3535    |                                        |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr        |                                        |         |         |                     |                     |
	|         | -v=1 --driver=hyperkit                 |                                        |         |         |                     |                     |
	| start   | -p                                     | running-upgrade-20220921152233-3535    | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:25 PDT |
	|         | running-upgrade-20220921152233-3535    |                                        |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr        |                                        |         |         |                     |                     |
	|         | -v=1 --driver=hyperkit                 |                                        |         |         |                     |                     |
	| delete  | -p                                     | stopped-upgrade-20220921152137-3535    | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:24 PDT |
	|         | stopped-upgrade-20220921152137-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --kubernetes-version=1.20              |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | running-upgrade-20220921152233-3535    | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | running-upgrade-20220921152233-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p pause-20220921152522-3535           | pause-20220921152522-3535              | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:26 PDT |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --install-addons=false                 |                                        |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit           |                                        |         |         |                     |                     |
	| delete  | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| ssh     | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet       |                                        |         |         |                     |                     |
	|         | service kubelet                        |                                        |         |         |                     |                     |
	| profile | list                                   | minikube                               | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	| profile | list --output=json                     | minikube                               | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	| stop    | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:26 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| ssh     | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet       |                                        |         |         |                     |                     |
	|         | service kubelet                        |                                        |         |         |                     |                     |
	| delete  | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT | 21 Sep 22 15:26 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p false-20220921151637-3535           | false-20220921151637-3535              | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT |                     |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --alsologtostderr --wait=true          |                                        |         |         |                     |                     |
	|         | --wait-timeout=5m --cni=false          |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p pause-20220921152522-3535           | pause-20220921152522-3535              | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT | 21 Sep 22 15:27 PDT |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	|---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/21 15:26:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.19.1 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0921 15:26:16.412297   10408 out.go:296] Setting OutFile to fd 1 ...
	I0921 15:26:16.412857   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.412883   10408 out.go:309] Setting ErrFile to fd 2...
	I0921 15:26:16.412925   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.413172   10408 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 15:26:16.413935   10408 out.go:303] Setting JSON to false
	I0921 15:26:16.429337   10408 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":5147,"bootTime":1663794029,"procs":382,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 15:26:16.429439   10408 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 15:26:16.451061   10408 out.go:177] * [pause-20220921152522-3535] minikube v1.27.0 on Darwin 12.6
	I0921 15:26:16.492895   10408 notify.go:214] Checking for updates...
	I0921 15:26:16.513942   10408 out.go:177]   - MINIKUBE_LOCATION=14995
	I0921 15:26:16.535147   10408 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:26:16.555899   10408 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 15:26:16.577004   10408 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 15:26:16.598036   10408 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	I0921 15:26:16.619232   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:16.619572   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.619620   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.626042   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52950
	I0921 15:26:16.626541   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.626992   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.627004   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.627211   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.627372   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.627501   10408 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 15:26:16.627783   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.627806   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.634000   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52952
	I0921 15:26:16.634367   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.634679   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.634691   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.634960   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.635067   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.661930   10408 out.go:177] * Using the hyperkit driver based on existing profile
	I0921 15:26:16.703890   10408 start.go:284] selected driver: hyperkit
	I0921 15:26:16.703910   10408 start.go:808] validating driver "hyperkit" against &{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.704025   10408 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0921 15:26:16.704092   10408 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.704203   10408 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0921 15:26:16.710571   10408 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.27.0
	I0921 15:26:16.713621   10408 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.713649   10408 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0921 15:26:16.715630   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:16.715647   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:16.715664   10408 start_flags.go:316] config:
	{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.715818   10408 iso.go:124] acquiring lock: {Name:mke8c57399926d29e846b47dd4be4625ba5fcaea Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.774023   10408 out.go:177] * Starting control plane node pause-20220921152522-3535 in cluster pause-20220921152522-3535
	I0921 15:26:14.112290   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0921 15:26:14.112374   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0921 15:26:14.112386   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0921 15:26:15.320346   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Attempt 3
	I0921 15:26:15.320365   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:15.320474   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:15.321107   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Searching for 36:15:df:cc:5b:5b in /var/db/dhcpd_leases ...
	I0921 15:26:15.321174   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found 28 entries in /var/db/dhcpd_leases!
	I0921 15:26:15.321185   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:3e:7a:92:24:5:ce ID:1,3e:7a:92:24:5:ce Lease:0x632b8f7f}
	I0921 15:26:15.321194   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:c2:90:21:6e:75:6 ID:1,c2:90:21:6e:75:6 Lease:0x632ce0da}
	I0921 15:26:15.321202   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:9e:f3:b1:1c:9b:1c ID:1,9e:f3:b1:1c:9b:1c Lease:0x632b8f54}
	I0921 15:26:15.321211   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:66:c5:83:6d:55:91 ID:1,66:c5:83:6d:55:91 Lease:0x632ce03b}
	I0921 15:26:15.321220   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ea:9c:f4:77:1d:3d ID:1,ea:9c:f4:77:1d:3d Lease:0x632ce076}
	I0921 15:26:15.321227   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:36:e:45:14:25:55 ID:1,36:e:45:14:25:55 Lease:0x632cdfb6}
	I0921 15:26:15.321236   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:92:2e:30:54:49:f3 ID:1,92:2e:30:54:49:f3 Lease:0x632b8de5}
	I0921 15:26:15.321243   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:1a:83:83:3:65:1a ID:1,1a:83:83:3:65:1a Lease:0x632cdf36}
	I0921 15:26:15.321252   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:b6:1a:2d:8:65:c5 ID:1,b6:1a:2d:8:65:c5 Lease:0x632cdf16}
	I0921 15:26:15.321259   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:72:4c:c8:cf:4f:63 ID:1,72:4c:c8:cf:4f:63 Lease:0x632b8dac}
	I0921 15:26:15.321274   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:c2:f8:ac:87:d9:f0 ID:1,c2:f8:ac:87:d9:f0 Lease:0x632b8d80}
	I0921 15:26:15.321291   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:35:c1:26:64:c0 ID:1,62:35:c1:26:64:c0 Lease:0x632b8d81}
	I0921 15:26:15.321303   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:96:24:b5:8e:13:fc ID:1,96:24:b5:8e:13:fc Lease:0x632cde86}
	I0921 15:26:15.321315   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:e:f1:67:89:3f:e3 ID:1,e:f1:67:89:3f:e3 Lease:0x632cde14}
	I0921 15:26:15.321324   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:a2:3d:49:78:3b:4c ID:1,a2:3d:49:78:3b:4c Lease:0x632cdd68}
	I0921 15:26:15.321339   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:1a:dd:bc:c:73:c4 ID:1,1a:dd:bc:c:73:c4 Lease:0x632cdd35}
	I0921 15:26:15.321350   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:52:e5:24:3b:ab:4 ID:1,52:e5:24:3b:ab:4 Lease:0x632b897b}
	I0921 15:26:15.321358   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:be:b4:fe:f4:b1:24 ID:1,be:b4:fe:f4:b1:24 Lease:0x632b8bde}
	I0921 15:26:15.321365   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8a:c8:9b:80:80:10 ID:1,8a:c8:9b:80:80:10 Lease:0x632b8bdc}
	I0921 15:26:15.321376   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:12:72:ad:9f:f1:8f ID:1,12:72:ad:9f:f1:8f Lease:0x632b8511}
	I0921 15:26:15.321387   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:4a:58:20:58:21:84 ID:1,4a:58:20:58:21:84 Lease:0x632b84fc}
	I0921 15:26:15.321395   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:4e:eb:64:20:d8:40 ID:1,4e:eb:64:20:d8:40 Lease:0x632b84d4}
	I0921 15:26:15.321404   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:96:cb:c8:56:48:73 ID:1,96:cb:c8:56:48:73 Lease:0x632cd609}
	I0921 15:26:15.321411   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:3e:60:ad:7c:55:a0 ID:1,3e:60:ad:7c:55:a0 Lease:0x632cd5c9}
	I0921 15:26:15.321418   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:2:7a:1a:6a:a6:1f ID:1,2:7a:1a:6a:a6:1f Lease:0x632b843f}
	I0921 15:26:15.321426   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9a:e7:f8:d0:27:5a ID:1,9a:e7:f8:d0:27:5a Lease:0x632cd449}
	I0921 15:26:15.321434   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:12:80:14:fc:de:ba ID:1,12:80:14:fc:de:ba Lease:0x632b82be}
	I0921 15:26:15.321440   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:56:cf:47:52:47:7e ID:1,56:cf:47:52:47:7e Lease:0x632b8281}
	I0921 15:26:17.321647   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Attempt 4
	I0921 15:26:17.321668   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:17.321761   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:17.322288   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Searching for 36:15:df:cc:5b:5b in /var/db/dhcpd_leases ...
	I0921 15:26:17.322356   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found 29 entries in /var/db/dhcpd_leases!
	I0921 15:26:17.322367   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:36:15:df:cc:5b:5b ID:1,36:15:df:cc:5b:5b Lease:0x632ce108}
	I0921 15:26:17.322380   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found match: 36:15:df:cc:5b:5b
	I0921 15:26:17.322390   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | IP: 192.168.64.30
	I0921 15:26:17.322428   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetConfigRaw
	I0921 15:26:17.322951   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:17.323049   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:17.323142   10389 main.go:134] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0921 15:26:17.323154   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:17.323221   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:17.323276   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:17.323815   10389 main.go:134] libmachine: Detecting operating system of created instance...
	I0921 15:26:17.323821   10389 main.go:134] libmachine: Waiting for SSH to be available...
	I0921 15:26:17.323832   10389 main.go:134] libmachine: Getting to WaitForSSH function...
	I0921 15:26:17.323840   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:17.323909   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:17.323997   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:17.324070   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:17.324148   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:17.324242   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:17.324383   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:17.324389   10389 main.go:134] libmachine: About to run SSH command:
	exit 0
	I0921 15:26:16.794876   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:16.794956   10408 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4
	I0921 15:26:16.795012   10408 cache.go:57] Caching tarball of preloaded images
	I0921 15:26:16.795122   10408 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0921 15:26:16.795144   10408 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.2 on docker
	I0921 15:26:16.795239   10408 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/config.json ...
	I0921 15:26:16.795594   10408 cache.go:208] Successfully downloaded all kic artifacts
	I0921 15:26:16.795620   10408 start.go:364] acquiring machines lock for pause-20220921152522-3535: {Name:mk2f7774d81f069136708da9f7558413d7930511 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0921 15:26:19.803647   10408 start.go:368] acquired machines lock for "pause-20220921152522-3535" in 3.008011859s
	I0921 15:26:19.803693   10408 start.go:96] Skipping create...Using existing machine configuration
	I0921 15:26:19.803704   10408 fix.go:55] fixHost starting: 
	I0921 15:26:19.804014   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:19.804040   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:19.810489   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52975
	I0921 15:26:19.810845   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:19.811156   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:19.811167   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:19.811357   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:19.811458   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.811557   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:26:19.811664   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:19.811739   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:26:19.812542   10408 fix.go:103] recreateIfNeeded on pause-20220921152522-3535: state=Running err=<nil>
	W0921 15:26:19.812564   10408 fix.go:129] unexpected machine state, will restart: <nil>
	I0921 15:26:19.835428   10408 out.go:177] * Updating the running hyperkit "pause-20220921152522-3535" VM ...
	I0921 15:26:19.856170   10408 machine.go:88] provisioning docker machine ...
	I0921 15:26:19.856192   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.856377   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856478   10408 buildroot.go:166] provisioning hostname "pause-20220921152522-3535"
	I0921 15:26:19.856489   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856574   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.856646   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.856744   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856835   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856914   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.857028   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.857193   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.857203   10408 main.go:134] libmachine: About to run SSH command:
	sudo hostname pause-20220921152522-3535 && echo "pause-20220921152522-3535" | sudo tee /etc/hostname
	I0921 15:26:19.929633   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: pause-20220921152522-3535
	
	I0921 15:26:19.929693   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.929883   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.930020   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930143   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930253   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.930438   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.930577   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.930595   10408 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-20220921152522-3535' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-20220921152522-3535/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-20220921152522-3535' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0921 15:26:19.992780   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:19.992803   10408 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube}
	I0921 15:26:19.992832   10408 buildroot.go:174] setting up certificates
	I0921 15:26:19.992843   10408 provision.go:83] configureAuth start
	I0921 15:26:19.992852   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.993017   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:19.993132   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.993213   10408 provision.go:138] copyHostCerts
	I0921 15:26:19.993302   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem, removing ...
	I0921 15:26:19.993310   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem
	I0921 15:26:19.993450   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem (1123 bytes)
	I0921 15:26:19.993643   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem, removing ...
	I0921 15:26:19.993649   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem
	I0921 15:26:19.993780   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem (1679 bytes)
	I0921 15:26:19.994087   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem, removing ...
	I0921 15:26:19.994094   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem
	I0921 15:26:19.994203   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem (1078 bytes)
	I0921 15:26:19.994341   10408 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem org=jenkins.pause-20220921152522-3535 san=[192.168.64.28 192.168.64.28 localhost 127.0.0.1 minikube pause-20220921152522-3535]
	I0921 15:26:20.145157   10408 provision.go:172] copyRemoteCerts
	I0921 15:26:20.145229   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0921 15:26:20.145247   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.145395   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.145492   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.145591   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.145687   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.181860   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0921 15:26:20.204288   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0921 15:26:20.223046   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0921 15:26:20.242859   10408 provision.go:86] duration metric: configureAuth took 250.000259ms
	I0921 15:26:20.242872   10408 buildroot.go:189] setting minikube options for container-runtime
	I0921 15:26:20.243031   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:20.243050   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.243218   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.243320   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.243440   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243555   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243661   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.243798   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.243914   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.243922   10408 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0921 15:26:20.307004   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0921 15:26:20.307030   10408 buildroot.go:70] root file system type: tmpfs
	I0921 15:26:20.307188   10408 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0921 15:26:20.307206   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.307379   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.307501   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307587   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307679   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.307823   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.307954   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.308011   10408 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0921 15:26:20.380017   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0921 15:26:20.380044   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.380193   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.380302   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380410   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380514   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.380665   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.380781   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.380797   10408 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0921 15:26:20.447616   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:20.447629   10408 machine.go:91] provisioned docker machine in 591.445478ms
	I0921 15:26:20.447641   10408 start.go:300] post-start starting for "pause-20220921152522-3535" (driver="hyperkit")
	I0921 15:26:20.447646   10408 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0921 15:26:20.447659   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.447885   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0921 15:26:20.447901   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.448051   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.448156   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.448291   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.448405   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.484862   10408 ssh_runner.go:195] Run: cat /etc/os-release
	I0921 15:26:20.487726   10408 info.go:137] Remote host: Buildroot 2021.02.12
	I0921 15:26:20.487742   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/addons for local assets ...
	I0921 15:26:20.487867   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files for local assets ...
	I0921 15:26:20.488046   10408 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem -> 35352.pem in /etc/ssl/certs
	I0921 15:26:20.488202   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0921 15:26:20.495074   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:20.515167   10408 start.go:303] post-start completed in 67.502258ms
	I0921 15:26:20.515187   10408 fix.go:57] fixHost completed within 711.484594ms
	I0921 15:26:20.515203   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.515368   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.515520   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515638   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515770   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.515941   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.516053   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.516063   10408 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0921 15:26:20.577712   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 1663799180.686854068
	
	I0921 15:26:20.577735   10408 fix.go:207] guest clock: 1663799180.686854068
	I0921 15:26:20.577746   10408 fix.go:220] Guest: 2022-09-21 15:26:20.686854068 -0700 PDT Remote: 2022-09-21 15:26:20.51519 -0700 PDT m=+4.146234536 (delta=171.664068ms)
	I0921 15:26:20.577765   10408 fix.go:191] guest clock delta is within tolerance: 171.664068ms
	I0921 15:26:20.577770   10408 start.go:83] releasing machines lock for "pause-20220921152522-3535", held for 774.111447ms
	I0921 15:26:20.577789   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.577928   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:20.578042   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578174   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578318   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578705   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578906   10408 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0921 15:26:20.578961   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.578984   10408 ssh_runner.go:195] Run: systemctl --version
	I0921 15:26:20.578999   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.579066   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579106   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579182   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579228   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579290   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579338   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579415   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.579448   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.650058   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:20.650150   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:20.668593   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:20.668610   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:20.668676   10408 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0921 15:26:20.679656   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0921 15:26:20.692651   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:20.702013   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0921 15:26:20.715942   10408 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0921 15:26:20.844184   10408 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0921 15:26:20.974988   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:21.117162   10408 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:18.404949   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:18.404961   10389 main.go:134] libmachine: Detecting the provisioner...
	I0921 15:26:18.404967   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.405102   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.405195   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.405274   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.405369   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.405482   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.405601   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.405610   10389 main.go:134] libmachine: About to run SSH command:
	cat /etc/os-release
	I0921 15:26:18.483176   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g1be7c81-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0921 15:26:18.483226   10389 main.go:134] libmachine: found compatible host: buildroot
	I0921 15:26:18.483233   10389 main.go:134] libmachine: Provisioning with buildroot...
	I0921 15:26:18.483245   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.483380   10389 buildroot.go:166] provisioning hostname "false-20220921151637-3535"
	I0921 15:26:18.483392   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.483485   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.483579   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.483675   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.483757   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.483857   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.483983   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.484098   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.484107   10389 main.go:134] libmachine: About to run SSH command:
	sudo hostname false-20220921151637-3535 && echo "false-20220921151637-3535" | sudo tee /etc/hostname
	I0921 15:26:18.570488   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: false-20220921151637-3535
	
	I0921 15:26:18.570510   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.570653   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.570761   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.570862   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.570935   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.571055   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.571174   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.571186   10389 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfalse-20220921151637-3535' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 false-20220921151637-3535/g' /etc/hosts;
				else 
					echo '127.0.1.1 false-20220921151637-3535' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0921 15:26:18.653580   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:18.653600   10389 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem ServerCertRemotePath
:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube}
	I0921 15:26:18.653620   10389 buildroot.go:174] setting up certificates
	I0921 15:26:18.653630   10389 provision.go:83] configureAuth start
	I0921 15:26:18.653637   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.653765   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:18.653853   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.653932   10389 provision.go:138] copyHostCerts
	I0921 15:26:18.654006   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem, removing ...
	I0921 15:26:18.654013   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem
	I0921 15:26:18.654127   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem (1078 bytes)
	I0921 15:26:18.654316   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem, removing ...
	I0921 15:26:18.654322   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem
	I0921 15:26:18.654389   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem (1123 bytes)
	I0921 15:26:18.654553   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem, removing ...
	I0921 15:26:18.654559   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem
	I0921 15:26:18.654614   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem (1679 bytes)
	I0921 15:26:18.654728   10389 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem org=jenkins.false-20220921151637-3535 san=[192.168.64.30 192.168.64.30 localhost 127.0.0.1 minikube false-20220921151637-3535]
	I0921 15:26:18.931086   10389 provision.go:172] copyRemoteCerts
	I0921 15:26:18.931145   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0921 15:26:18.931162   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.931342   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.931454   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.931547   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.931640   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:18.977451   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0921 15:26:18.993393   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0921 15:26:19.009261   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0921 15:26:19.024820   10389 provision.go:86] duration metric: configureAuth took 371.177848ms
	I0921 15:26:19.024832   10389 buildroot.go:189] setting minikube options for container-runtime
	I0921 15:26:19.024951   10389 config.go:180] Loaded profile config "false-20220921151637-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:19.024965   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.025081   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.025169   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.025260   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.025332   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.025427   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.025536   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.025635   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.025643   10389 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0921 15:26:19.103232   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0921 15:26:19.103245   10389 buildroot.go:70] root file system type: tmpfs
	I0921 15:26:19.103367   10389 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0921 15:26:19.103382   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.103506   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.103596   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.103680   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.103774   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.103895   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.103995   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.104045   10389 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0921 15:26:19.189517   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0921 15:26:19.189540   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.189677   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.189768   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.189857   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.189943   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.190071   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.190182   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.190195   10389 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0921 15:26:19.657263   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0921 15:26:19.657285   10389 main.go:134] libmachine: Checking connection to Docker...
	I0921 15:26:19.657293   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetURL
	I0921 15:26:19.657424   10389 main.go:134] libmachine: Docker is up and running!
	I0921 15:26:19.657433   10389 main.go:134] libmachine: Reticulating splines...
	I0921 15:26:19.657441   10389 client.go:171] LocalClient.Create took 10.876166724s
	I0921 15:26:19.657453   10389 start.go:167] duration metric: libmachine.API.Create for "false-20220921151637-3535" took 10.876232302s
	I0921 15:26:19.657465   10389 start.go:300] post-start starting for "false-20220921151637-3535" (driver="hyperkit")
	I0921 15:26:19.657470   10389 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0921 15:26:19.657481   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.657606   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0921 15:26:19.657623   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.657718   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.657815   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.657900   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.657993   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.701002   10389 ssh_runner.go:195] Run: cat /etc/os-release
	I0921 15:26:19.703660   10389 info.go:137] Remote host: Buildroot 2021.02.12
	I0921 15:26:19.703675   10389 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/addons for local assets ...
	I0921 15:26:19.703763   10389 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files for local assets ...
	I0921 15:26:19.703898   10389 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem -> 35352.pem in /etc/ssl/certs
	I0921 15:26:19.704044   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0921 15:26:19.710387   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:19.725495   10389 start.go:303] post-start completed in 68.018939ms
	I0921 15:26:19.725521   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetConfigRaw
	I0921 15:26:19.726077   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:19.726225   10389 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/config.json ...
	I0921 15:26:19.726508   10389 start.go:128] duration metric: createHost completed in 10.995583539s
	I0921 15:26:19.726524   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.726609   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.726688   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.726756   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.726824   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.726940   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.727032   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.727039   10389 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0921 15:26:19.803566   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 1663799179.904471962
	
	I0921 15:26:19.803578   10389 fix.go:207] guest clock: 1663799179.904471962
	I0921 15:26:19.803583   10389 fix.go:220] Guest: 2022-09-21 15:26:19.904471962 -0700 PDT Remote: 2022-09-21 15:26:19.726515 -0700 PDT m=+11.397811697 (delta=177.956962ms)
	I0921 15:26:19.803600   10389 fix.go:191] guest clock delta is within tolerance: 177.956962ms
	I0921 15:26:19.803604   10389 start.go:83] releasing machines lock for "false-20220921151637-3535", held for 11.072844405s
	I0921 15:26:19.803620   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.803781   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:19.803886   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.803980   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804107   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804405   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804511   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804569   10389 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0921 15:26:19.804599   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.804676   10389 ssh_runner.go:195] Run: systemctl --version
	I0921 15:26:19.804691   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.804696   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.804788   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.804809   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.804910   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.804933   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.804984   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.805022   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.805139   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.847227   10389 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:19.847314   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:19.886987   10389 docker.go:611] Got preloaded images: 
	I0921 15:26:19.887002   10389 docker.go:617] registry.k8s.io/kube-apiserver:v1.25.2 wasn't preloaded
	I0921 15:26:19.887058   10389 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0921 15:26:19.893540   10389 ssh_runner.go:195] Run: which lz4
	I0921 15:26:19.895930   10389 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0921 15:26:19.898413   10389 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0921 15:26:19.898432   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (404136294 bytes)
	I0921 15:26:21.239426   10389 docker.go:576] Took 1.343526 seconds to copy over tarball
	I0921 15:26:21.239490   10389 ssh_runner.go:195] Run: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4
	I0921 15:26:24.582087   10389 ssh_runner.go:235] Completed: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4: (3.342576242s)
	I0921 15:26:24.582101   10389 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0921 15:26:24.608006   10389 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0921 15:26:24.614121   10389 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2628 bytes)
	I0921 15:26:24.625086   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:24.705194   10389 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:25.931663   10389 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.226446575s)
	I0921 15:26:25.931758   10389 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0921 15:26:25.941064   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0921 15:26:25.952201   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:25.960686   10389 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0921 15:26:25.983070   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:25.991760   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0921 15:26:26.004137   10389 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0921 15:26:26.084992   10389 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0921 15:26:26.179551   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:26.278839   10389 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:27.498830   10389 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.219969179s)
	I0921 15:26:27.498903   10389 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0921 15:26:27.582227   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:27.670077   10389 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0921 15:26:27.680350   10389 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0921 15:26:27.680426   10389 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0921 15:26:27.684229   10389 start.go:471] Will wait 60s for crictl version
	I0921 15:26:27.684283   10389 ssh_runner.go:195] Run: sudo crictl version
	I0921 15:26:27.710285   10389 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.18
	RuntimeApiVersion:  1.41.0
	I0921 15:26:27.710350   10389 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:27.730543   10389 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:27.776346   10389 out.go:204] * Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	I0921 15:26:27.776499   10389 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0921 15:26:27.779532   10389 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.64.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0921 15:26:27.786983   10389 localpath.go:92] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/client.crt -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt
	I0921 15:26:27.787207   10389 localpath.go:117] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/client.key -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.key
	I0921 15:26:27.787377   10389 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:27.787423   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:27.803222   10389 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:27.803238   10389 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:27.803305   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:27.818382   10389 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:27.818399   10389 cache_images.go:84] Images are preloaded, skipping loading
	I0921 15:26:27.818461   10389 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0921 15:26:27.839813   10389 cni.go:95] Creating CNI manager for "false"
	I0921 15:26:27.839834   10389 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0921 15:26:27.839848   10389 kubeadm.go:156] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.30 APIServerPort:8443 KubernetesVersion:v1.25.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:false-20220921151637-3535 NodeName:false-20220921151637-3535 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.30"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.30 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false}
	I0921 15:26:27.839927   10389 kubeadm.go:161] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.30
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "false-20220921151637-3535"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.30
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.30"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0921 15:26:27.839993   10389 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=false-20220921151637-3535 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.30 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.2 ClusterName:false-20220921151637-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:false NodeIP: NodePort:8443 NodeName:}
	I0921 15:26:27.840044   10389 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.2
	I0921 15:26:27.846485   10389 binaries.go:44] Found k8s binaries, skipping transfer
	I0921 15:26:27.846528   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0921 15:26:27.852711   10389 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (488 bytes)
	I0921 15:26:27.863719   10389 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0921 15:26:27.874539   10389 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2050 bytes)
	I0921 15:26:27.885620   10389 ssh_runner.go:195] Run: grep 192.168.64.30	control-plane.minikube.internal$ /etc/hosts
	I0921 15:26:27.887836   10389 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.64.30	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0921 15:26:27.895111   10389 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535 for IP: 192.168.64.30
	I0921 15:26:27.895206   10389 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key
	I0921 15:26:27.895255   10389 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key
	I0921 15:26:27.895337   10389 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.key
	I0921 15:26:27.895361   10389 certs.go:302] generating minikube signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b
	I0921 15:26:27.895377   10389 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b with IP's: [192.168.64.30 10.96.0.1 127.0.0.1 10.0.0.1]
	I0921 15:26:28.090626   10389 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b ...
	I0921 15:26:28.090639   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b: {Name:mkd0021f0880c17472bc34f2bb7b8af87d7a861d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.090958   10389 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b ...
	I0921 15:26:28.090971   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b: {Name:mk0105b4976084bcdc477e16d22340c1f19a3c15 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.091184   10389 certs.go:320] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt
	I0921 15:26:28.091356   10389 certs.go:324] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key
	I0921 15:26:28.091534   10389 certs.go:302] generating aggregator signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key
	I0921 15:26:28.091547   10389 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt with IP's: []
	I0921 15:26:28.128749   10389 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt ...
	I0921 15:26:28.128759   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt: {Name:mkb235bcbbe39e8b7fc7fa2af71bd625a04514fb Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.129197   10389 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key ...
	I0921 15:26:28.129204   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key: {Name:mkc7b1d50dce94488cf946b55e321c2fd8195b2c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.129644   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem (1338 bytes)
	W0921 15:26:28.129681   10389 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535_empty.pem, impossibly tiny 0 bytes
	I0921 15:26:28.129689   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem (1679 bytes)
	I0921 15:26:28.129738   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem (1078 bytes)
	I0921 15:26:28.129767   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem (1123 bytes)
	I0921 15:26:28.129794   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem (1679 bytes)
	I0921 15:26:28.129854   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:28.130421   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0921 15:26:28.147670   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0921 15:26:28.163433   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0921 15:26:28.178707   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0921 15:26:28.193799   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0921 15:26:28.208841   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0921 15:26:28.224170   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0921 15:26:28.239235   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0921 15:26:28.254997   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0921 15:26:28.270476   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem --> /usr/share/ca-certificates/3535.pem (1338 bytes)
	I0921 15:26:28.285761   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /usr/share/ca-certificates/35352.pem (1708 bytes)
	I0921 15:26:28.300863   10389 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0921 15:26:28.311541   10389 ssh_runner.go:195] Run: openssl version
	I0921 15:26:28.314918   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/35352.pem && ln -fs /usr/share/ca-certificates/35352.pem /etc/ssl/certs/35352.pem"
	I0921 15:26:28.322006   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.324825   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep 21 21:31 /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.324854   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.328317   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/35352.pem /etc/ssl/certs/3ec20f2e.0"
	I0921 15:26:28.335399   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0921 15:26:28.342321   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.345213   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep 21 21:27 /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.345248   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.348680   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0921 15:26:28.355668   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3535.pem && ln -fs /usr/share/ca-certificates/3535.pem /etc/ssl/certs/3535.pem"
	I0921 15:26:28.362704   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.365564   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep 21 21:31 /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.365597   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.369054   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3535.pem /etc/ssl/certs/51391683.0"
	I0921 15:26:28.375971   10389 kubeadm.go:396] StartCluster: {Name:false-20220921151637-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:false-20220921151637-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:false NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.30 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:28.393673   10389 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:26:28.410852   10389 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0921 15:26:28.417363   10389 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0921 15:26:28.423501   10389 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0921 15:26:28.429757   10389 kubeadm.go:152] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0921 15:26:28.429778   10389 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,Mem"
	I0921 15:26:28.485563   10389 kubeadm.go:317] [init] Using Kubernetes version: v1.25.2
	I0921 15:26:28.485628   10389 kubeadm.go:317] [preflight] Running pre-flight checks
	I0921 15:26:28.613102   10389 kubeadm.go:317] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0921 15:26:28.613192   10389 kubeadm.go:317] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0921 15:26:28.613262   10389 kubeadm.go:317] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0921 15:26:28.713134   10389 kubeadm.go:317] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0921 15:26:29.173173   10408 ssh_runner.go:235] Completed: sudo systemctl restart docker: (8.055980768s)
	I0921 15:26:29.173240   10408 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0921 15:26:29.288535   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:29.417731   10408 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0921 15:26:29.433270   10408 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0921 15:26:29.433356   10408 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0921 15:26:29.447293   10408 start.go:471] Will wait 60s for crictl version
	I0921 15:26:29.447353   10408 ssh_runner.go:195] Run: sudo crictl version
	I0921 15:26:29.482799   10408 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.18
	RuntimeApiVersion:  1.41.0
	I0921 15:26:29.482858   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.651357   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.808439   10408 out.go:204] * Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	I0921 15:26:29.808534   10408 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0921 15:26:29.818111   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:29.818177   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.873620   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.873633   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:29.873699   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.929931   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.929952   10408 cache_images.go:84] Images are preloaded, skipping loading
	I0921 15:26:29.930056   10408 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0921 15:26:30.064287   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:30.064305   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:30.064320   10408 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0921 15:26:30.064331   10408 kubeadm.go:156] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.28 APIServerPort:8443 KubernetesVersion:v1.25.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-20220921152522-3535 NodeName:pause-20220921152522-3535 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.28"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.28 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false}
	I0921 15:26:30.064423   10408 kubeadm.go:161] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.28
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "pause-20220921152522-3535"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.28
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.28"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0921 15:26:30.064505   10408 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=pause-20220921152522-3535 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.28 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I0921 15:26:30.064579   10408 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.2
	I0921 15:26:30.076550   10408 binaries.go:44] Found k8s binaries, skipping transfer
	I0921 15:26:30.076638   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0921 15:26:30.090012   10408 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (488 bytes)
	I0921 15:26:30.137803   10408 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0921 15:26:30.178146   10408 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2050 bytes)
	I0921 15:26:30.203255   10408 ssh_runner.go:195] Run: grep 192.168.64.28	control-plane.minikube.internal$ /etc/hosts
	I0921 15:26:30.209779   10408 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535 for IP: 192.168.64.28
	I0921 15:26:30.209879   10408 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key
	I0921 15:26:30.209934   10408 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key
	I0921 15:26:30.210019   10408 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key
	I0921 15:26:30.210082   10408 certs.go:298] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key.6733b561
	I0921 15:26:30.210133   10408 certs.go:298] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key
	I0921 15:26:30.210333   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem (1338 bytes)
	W0921 15:26:30.210375   10408 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535_empty.pem, impossibly tiny 0 bytes
	I0921 15:26:30.210388   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem (1679 bytes)
	I0921 15:26:30.210421   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem (1078 bytes)
	I0921 15:26:30.210453   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem (1123 bytes)
	I0921 15:26:30.210483   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem (1679 bytes)
	I0921 15:26:30.210550   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:30.211086   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0921 15:26:30.279069   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0921 15:26:30.343250   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0921 15:26:30.413180   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0921 15:26:30.448798   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0921 15:26:30.476175   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0921 15:26:30.497204   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0921 15:26:30.524103   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0921 15:26:30.558966   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /usr/share/ca-certificates/35352.pem (1708 bytes)
	I0921 15:26:30.576319   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0921 15:26:30.592912   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem --> /usr/share/ca-certificates/3535.pem (1338 bytes)
	I0921 15:26:30.609099   10408 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0921 15:26:30.627179   10408 ssh_runner.go:195] Run: openssl version
	I0921 15:26:30.632801   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3535.pem && ln -fs /usr/share/ca-certificates/3535.pem /etc/ssl/certs/3535.pem"
	I0921 15:26:30.641473   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645794   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep 21 21:31 /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645836   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.649794   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3535.pem /etc/ssl/certs/51391683.0"
	I0921 15:26:30.657630   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/35352.pem && ln -fs /usr/share/ca-certificates/35352.pem /etc/ssl/certs/35352.pem"
	I0921 15:26:30.665747   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669804   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep 21 21:31 /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669850   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.679638   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/35352.pem /etc/ssl/certs/3ec20f2e.0"
	I0921 15:26:30.700907   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0921 15:26:30.734369   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762750   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep 21 21:27 /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762827   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.777627   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0921 15:26:30.785856   10408 kubeadm.go:396] StartCluster: {Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:30.785963   10408 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:26:30.816264   10408 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0921 15:26:30.823179   10408 kubeadm.go:411] found existing configuration files, will attempt cluster restart
	I0921 15:26:30.823195   10408 kubeadm.go:627] restartCluster start
	I0921 15:26:30.823236   10408 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0921 15:26:30.837045   10408 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:26:30.837457   10408 kubeconfig.go:92] found "pause-20220921152522-3535" server: "https://192.168.64.28:8443"
	I0921 15:26:30.837839   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:26:30.838375   10408 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0921 15:26:30.852535   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:30.852588   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:30.868059   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:30.876185   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:30.876238   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:30.912452   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:30.912472   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:28.751035   10389 out.go:204]   - Generating certificates and keys ...
	I0921 15:26:28.751152   10389 kubeadm.go:317] [certs] Using existing ca certificate authority
	I0921 15:26:28.751236   10389 kubeadm.go:317] [certs] Using existing apiserver certificate and key on disk
	I0921 15:26:28.782482   10389 kubeadm.go:317] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0921 15:26:29.137189   10389 kubeadm.go:317] [certs] Generating "front-proxy-ca" certificate and key
	I0921 15:26:29.241745   10389 kubeadm.go:317] [certs] Generating "front-proxy-client" certificate and key
	I0921 15:26:29.350166   10389 kubeadm.go:317] [certs] Generating "etcd/ca" certificate and key
	I0921 15:26:29.505698   10389 kubeadm.go:317] [certs] Generating "etcd/server" certificate and key
	I0921 15:26:29.505932   10389 kubeadm.go:317] [certs] etcd/server serving cert is signed for DNS names [false-20220921151637-3535 localhost] and IPs [192.168.64.30 127.0.0.1 ::1]
	I0921 15:26:29.604706   10389 kubeadm.go:317] [certs] Generating "etcd/peer" certificate and key
	I0921 15:26:29.604909   10389 kubeadm.go:317] [certs] etcd/peer serving cert is signed for DNS names [false-20220921151637-3535 localhost] and IPs [192.168.64.30 127.0.0.1 ::1]
	I0921 15:26:29.834088   10389 kubeadm.go:317] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0921 15:26:29.943628   10389 kubeadm.go:317] [certs] Generating "apiserver-etcd-client" certificate and key
	I0921 15:26:30.177452   10389 kubeadm.go:317] [certs] Generating "sa" key and public key
	I0921 15:26:30.177562   10389 kubeadm.go:317] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0921 15:26:30.679764   10389 kubeadm.go:317] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0921 15:26:30.762950   10389 kubeadm.go:317] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0921 15:26:30.975611   10389 kubeadm.go:317] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0921 15:26:31.368343   10389 kubeadm.go:317] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0921 15:26:31.380985   10389 kubeadm.go:317] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0921 15:26:31.381763   10389 kubeadm.go:317] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0921 15:26:31.381810   10389 kubeadm.go:317] [kubelet-start] Starting the kubelet
	I0921 15:26:31.468060   10389 kubeadm.go:317] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0921 15:26:31.487973   10389 out.go:204]   - Booting up control plane ...
	I0921 15:26:31.488058   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0921 15:26:31.488140   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0921 15:26:31.488216   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0921 15:26:31.488288   10389 kubeadm.go:317] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0921 15:26:31.488408   10389 kubeadm.go:317] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests". This can take up to 4m0s
	I0921 15:26:35.914013   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:35.914061   10408 retry.go:31] will retry after 263.082536ms: state is "Stopped"
	I0921 15:26:36.179260   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:41.180983   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:41.181007   10408 retry.go:31] will retry after 381.329545ms: state is "Stopped"
	I0921 15:26:43.469751   10389 kubeadm.go:317] [apiclient] All control plane components are healthy after 12.003918 seconds
	I0921 15:26:43.469852   10389 kubeadm.go:317] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0921 15:26:43.477591   10389 kubeadm.go:317] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0921 15:26:44.989240   10389 kubeadm.go:317] [upload-certs] Skipping phase. Please see --upload-certs
	I0921 15:26:44.989436   10389 kubeadm.go:317] [mark-control-plane] Marking the node false-20220921151637-3535 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0921 15:26:45.496387   10389 kubeadm.go:317] [bootstrap-token] Using token: gw23ty.315hs4knjisv0ijr
	I0921 15:26:41.563913   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:45.534959   10389 out.go:204]   - Configuring RBAC rules ...
	I0921 15:26:45.535164   10389 kubeadm.go:317] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0921 15:26:45.535348   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0921 15:26:45.575312   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0921 15:26:45.577832   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0921 15:26:45.580659   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0921 15:26:45.582707   10389 kubeadm.go:317] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0921 15:26:45.589329   10389 kubeadm.go:317] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0921 15:26:45.765645   10389 kubeadm.go:317] [addons] Applied essential addon: CoreDNS
	I0921 15:26:45.903347   10389 kubeadm.go:317] [addons] Applied essential addon: kube-proxy
	I0921 15:26:45.903987   10389 kubeadm.go:317] 
	I0921 15:26:45.904052   10389 kubeadm.go:317] Your Kubernetes control-plane has initialized successfully!
	I0921 15:26:45.904063   10389 kubeadm.go:317] 
	I0921 15:26:45.904125   10389 kubeadm.go:317] To start using your cluster, you need to run the following as a regular user:
	I0921 15:26:45.904133   10389 kubeadm.go:317] 
	I0921 15:26:45.904151   10389 kubeadm.go:317]   mkdir -p $HOME/.kube
	I0921 15:26:45.904270   10389 kubeadm.go:317]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0921 15:26:45.904382   10389 kubeadm.go:317]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0921 15:26:45.904399   10389 kubeadm.go:317] 
	I0921 15:26:45.904507   10389 kubeadm.go:317] Alternatively, if you are the root user, you can run:
	I0921 15:26:45.904518   10389 kubeadm.go:317] 
	I0921 15:26:45.904599   10389 kubeadm.go:317]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0921 15:26:45.904608   10389 kubeadm.go:317] 
	I0921 15:26:45.904652   10389 kubeadm.go:317] You should now deploy a pod network to the cluster.
	I0921 15:26:45.904743   10389 kubeadm.go:317] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0921 15:26:45.904821   10389 kubeadm.go:317]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0921 15:26:45.904853   10389 kubeadm.go:317] 
	I0921 15:26:45.904929   10389 kubeadm.go:317] You can now join any number of control-plane nodes by copying certificate authorities
	I0921 15:26:45.905009   10389 kubeadm.go:317] and service account keys on each node and then running the following as root:
	I0921 15:26:45.905013   10389 kubeadm.go:317] 
	I0921 15:26:45.905081   10389 kubeadm.go:317]   kubeadm join control-plane.minikube.internal:8443 --token gw23ty.315hs4knjisv0ijr \
	I0921 15:26:45.905165   10389 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:706daf9048108456ab2312c550f8f0627aeca112971c3da5a874015a0cee155c \
	I0921 15:26:45.905182   10389 kubeadm.go:317] 	--control-plane 
	I0921 15:26:45.905187   10389 kubeadm.go:317] 
	I0921 15:26:45.905254   10389 kubeadm.go:317] Then you can join any number of worker nodes by running the following on each as root:
	I0921 15:26:45.905261   10389 kubeadm.go:317] 
	I0921 15:26:45.905329   10389 kubeadm.go:317] kubeadm join control-plane.minikube.internal:8443 --token gw23ty.315hs4knjisv0ijr \
	I0921 15:26:45.905405   10389 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:706daf9048108456ab2312c550f8f0627aeca112971c3da5a874015a0cee155c 
	I0921 15:26:45.906103   10389 kubeadm.go:317] W0921 22:26:28.588830    1256 initconfiguration.go:119] Usage of CRI endpoints without URL scheme is deprecated and can cause kubelet errors in the future. Automatically prepending scheme "unix" to the "criSocket" with value "/var/run/cri-dockerd.sock". Please update your configuration!
	I0921 15:26:45.906192   10389 kubeadm.go:317] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0921 15:26:45.906207   10389 cni.go:95] Creating CNI manager for "false"
	I0921 15:26:45.906225   10389 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0921 15:26:45.906290   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:45.906301   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl label nodes minikube.k8s.io/version=v1.27.0 minikube.k8s.io/commit=937c68716dfaac5b5ffa3b6655158d5d3472b8c4 minikube.k8s.io/name=false-20220921151637-3535 minikube.k8s.io/updated_at=2022_09_21T15_26_45_0700 minikube.k8s.io/primary=true --all --overwrite --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.087744   10389 ops.go:34] apiserver oom_adj: -16
	I0921 15:26:46.087768   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.661358   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:47.163233   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:47.661991   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:48.162015   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.564586   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:46.766257   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:46.766358   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:46.776615   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:46.782756   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:46.782801   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:46.789298   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:46.789309   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.288815   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": read tcp 192.168.64.1:52998->192.168.64.28:8443: read: connection reset by peer
	I0921 15:26:51.288848   10408 retry.go:31] will retry after 242.214273ms: state is "Stopped"
	I0921 15:26:48.662979   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:49.163023   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:49.662057   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:50.162176   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:50.663300   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.162051   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.661237   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:52.161318   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:52.663231   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:53.162177   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.532207   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.632400   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:51.632425   10408 retry.go:31] will retry after 300.724609ms: state is "Stopped"
	I0921 15:26:51.934415   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.035144   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.035176   10408 retry.go:31] will retry after 427.113882ms: state is "Stopped"
	I0921 15:26:52.464328   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.566391   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.566426   10408 retry.go:31] will retry after 382.2356ms: state is "Stopped"
	I0921 15:26:52.948987   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.049570   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.049605   10408 retry.go:31] will retry after 505.529557ms: state is "Stopped"
	I0921 15:26:53.556334   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.658245   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.658268   10408 retry.go:31] will retry after 609.195524ms: state is "Stopped"
	I0921 15:26:54.269593   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:54.371296   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:54.371340   10408 retry.go:31] will retry after 858.741692ms: state is "Stopped"
	I0921 15:26:55.230116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:55.331214   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:55.331251   10408 retry.go:31] will retry after 1.201160326s: state is "Stopped"
	I0921 15:26:53.661186   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:54.163293   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:54.661188   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:55.161203   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:55.661768   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:56.161278   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:56.661209   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:57.161293   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:57.227024   10389 kubeadm.go:1067] duration metric: took 11.320770189s to wait for elevateKubeSystemPrivileges.
	I0921 15:26:57.227047   10389 kubeadm.go:398] StartCluster complete in 28.851048117s
	I0921 15:26:57.227062   10389 settings.go:142] acquiring lock: {Name:mkb00f1de0b91d8f67bd982eab088d27845674b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:57.227132   10389 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:26:57.227768   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig: {Name:mka2f83e1cbd4124ff7179732fbb172d977cf2f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:57.740783   10389 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "false-20220921151637-3535" rescaled to 1
	I0921 15:26:57.740812   10389 start.go:211] Will wait 5m0s for node &{Name: IP:192.168.64.30 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0921 15:26:57.740821   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0921 15:26:57.740854   10389 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0921 15:26:57.740962   10389 config.go:180] Loaded profile config "false-20220921151637-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:57.786566   10389 addons.go:65] Setting storage-provisioner=true in profile "false-20220921151637-3535"
	I0921 15:26:57.786585   10389 addons.go:153] Setting addon storage-provisioner=true in "false-20220921151637-3535"
	I0921 15:26:57.786585   10389 addons.go:65] Setting default-storageclass=true in profile "false-20220921151637-3535"
	I0921 15:26:57.786492   10389 out.go:177] * Verifying Kubernetes components...
	W0921 15:26:57.786593   10389 addons.go:162] addon storage-provisioner should already be in state true
	I0921 15:26:57.786605   10389 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "false-20220921151637-3535"
	I0921 15:26:57.786637   10389 host.go:66] Checking if "false-20220921151637-3535" exists ...
	I0921 15:26:57.823578   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:26:57.824055   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.824059   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.824098   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.824128   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.831913   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53008
	I0921 15:26:57.831981   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53009
	I0921 15:26:57.832340   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.832352   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.832684   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.832694   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.832700   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.832713   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.832896   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.832944   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.832993   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.833084   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.833170   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.833345   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.833360   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.839848   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53012
	I0921 15:26:57.840218   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.840571   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.840590   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.840793   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.840888   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.840964   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.841057   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.841584   10389 addons.go:153] Setting addon default-storageclass=true in "false-20220921151637-3535"
	W0921 15:26:57.841596   10389 addons.go:162] addon default-storageclass should already be in state true
	I0921 15:26:57.841612   10389 host.go:66] Checking if "false-20220921151637-3535" exists ...
	I0921 15:26:57.841859   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.841874   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.841903   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:57.848370   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53014
	I0921 15:26:57.879837   10389 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0921 15:26:57.853392   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0921 15:26:57.856801   10389 node_ready.go:35] waiting up to 5m0s for node "false-20220921151637-3535" to be "Ready" ...
	I0921 15:26:57.880708   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.901652   10389 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:26:57.901674   10389 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0921 15:26:57.901717   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:57.902040   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:57.902220   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.902228   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:57.902244   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.902481   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:57.902678   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.902711   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:57.903323   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.903348   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.907923   10389 node_ready.go:49] node "false-20220921151637-3535" has status "Ready":"True"
	I0921 15:26:57.907937   10389 node_ready.go:38] duration metric: took 6.436476ms waiting for node "false-20220921151637-3535" to be "Ready" ...
	I0921 15:26:57.907943   10389 pod_ready.go:35] extra waiting up to 5m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:26:57.910202   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53017
	I0921 15:26:57.910546   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.910873   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.910889   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.911076   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.911170   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.911256   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.911338   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.912159   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:57.912315   10389 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0921 15:26:57.912323   10389 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0921 15:26:57.912331   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:57.912418   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:57.912497   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:57.912584   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:57.912659   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:57.919652   10389 pod_ready.go:78] waiting up to 5m0s for pod "coredns-565d847f94-pns2v" in "kube-system" namespace to be "Ready" ...
	I0921 15:26:58.008677   10389 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0921 15:26:58.015955   10389 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:26:59.137018   10389 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (1.235523727s)
	I0921 15:26:59.137048   10389 start.go:810] {"host.minikube.internal": 192.168.64.1} host record injected into CoreDNS
	I0921 15:26:59.214166   10389 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.198193011s)
	I0921 15:26:59.214197   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214212   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214261   10389 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (1.205563718s)
	I0921 15:26:59.214276   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214283   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214398   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.214419   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.214438   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214449   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214452   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214458   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214464   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214465   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214473   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214483   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214582   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214593   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214605   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214615   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214655   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214663   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214784   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214810   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214847   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.257530   10389 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0921 15:26:56.533116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:56.635643   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:56.635670   10408 retry.go:31] will retry after 1.723796097s: state is "Stopped"
	I0921 15:26:58.359704   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:58.461478   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:58.461505   10408 retry.go:31] will retry after 1.596532639s: state is "Stopped"
	I0921 15:27:00.059136   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:00.159945   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:27:00.159971   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:27:00.160018   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0921 15:27:00.169632   10408 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:00.169647   10408 kubeadm.go:602] needs reconfigure: apiserver error: timed out waiting for the condition
	I0921 15:27:00.169656   10408 kubeadm.go:1114] stopping kube-system containers ...
	I0921 15:27:00.169722   10408 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:27:00.201882   10408 docker.go:443] Stopping containers: [d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49]
	I0921 15:27:00.201952   10408 ssh_runner.go:195] Run: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49
	I0921 15:26:59.279382   10389 addons.go:414] enableAddons completed in 1.538525769s
	I0921 15:26:59.940505   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:02.438511   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:05.344188   10408 ssh_runner.go:235] Completed: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49: (5.142213633s)
	I0921 15:27:05.344244   10408 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0921 15:27:05.419551   10408 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0921 15:27:05.433375   10408 kubeadm.go:155] found existing configuration files:
	-rw------- 1 root root 5643 Sep 21 22:25 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5657 Sep 21 22:25 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Sep 21 22:25 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5601 Sep 21 22:25 /etc/kubernetes/scheduler.conf
	
	I0921 15:27:05.433432   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0921 15:27:05.439704   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0921 15:27:05.445874   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.453215   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.453270   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.459417   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0921 15:27:05.465309   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.465358   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0921 15:27:05.476008   10408 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484410   10408 kubeadm.go:704] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484426   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:05.534434   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:04.440960   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:06.941172   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:06.469884   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.628867   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.698897   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.759299   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:06.759353   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:06.778540   10408 api_server.go:71] duration metric: took 19.241402ms to wait for apiserver process to appear ...
	I0921 15:27:06.778552   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:06.778559   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:09.441803   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:09.938218   10389 pod_ready.go:97] error getting pod "coredns-565d847f94-pns2v" in "kube-system" namespace (skipping!): pods "coredns-565d847f94-pns2v" not found
	I0921 15:27:09.938237   10389 pod_ready.go:81] duration metric: took 12.018553938s waiting for pod "coredns-565d847f94-pns2v" in "kube-system" namespace to be "Ready" ...
	E0921 15:27:09.938247   10389 pod_ready.go:66] WaitExtra: waitPodCondition: error getting pod "coredns-565d847f94-pns2v" in "kube-system" namespace (skipping!): pods "coredns-565d847f94-pns2v" not found
	I0921 15:27:09.938253   10389 pod_ready.go:78] waiting up to 5m0s for pod "coredns-565d847f94-wwhtk" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:11.950940   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:11.780440   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:27:12.280518   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.000183   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0921 15:27:14.000198   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0921 15:27:14.282668   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.289281   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.289293   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:14.780762   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.786529   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.786540   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:15.280930   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:15.288106   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:15.292969   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:15.292981   10408 api_server.go:130] duration metric: took 8.514415313s to wait for apiserver health ...
	I0921 15:27:15.292986   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:27:15.292994   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:27:15.293004   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:15.298309   10408 system_pods.go:59] 6 kube-system pods found
	I0921 15:27:15.298324   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:15.298330   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0921 15:27:15.298335   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0921 15:27:15.298340   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0921 15:27:15.298344   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:15.298348   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0921 15:27:15.298352   10408 system_pods.go:74] duration metric: took 5.344262ms to wait for pod list to return data ...
	I0921 15:27:15.298357   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:15.300304   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:15.300319   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:15.300328   10408 node_conditions.go:105] duration metric: took 1.967816ms to run NodePressure ...
	I0921 15:27:15.300342   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:15.402185   10408 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405062   10408 kubeadm.go:778] kubelet initialised
	I0921 15:27:15.405072   10408 kubeadm.go:779] duration metric: took 2.873657ms waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405080   10408 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:15.408132   10408 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411452   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:15.411459   10408 pod_ready.go:81] duration metric: took 3.317632ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411465   10408 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:14.445892   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:16.945831   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:17.420289   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:19.421503   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:18.946719   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:20.947256   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:22.950309   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:21.919889   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:24.419226   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:25.920028   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.920043   10408 pod_ready.go:81] duration metric: took 10.508561161s waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.920049   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923063   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.923071   10408 pod_ready.go:81] duration metric: took 3.017613ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923077   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926284   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.926292   10408 pod_ready.go:81] duration metric: took 3.20987ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926297   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929448   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.929456   10408 pod_ready.go:81] duration metric: took 3.154194ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929461   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932599   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.932606   10408 pod_ready.go:81] duration metric: took 3.140486ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932610   10408 pod_ready.go:38] duration metric: took 10.527510396s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:25.932619   10408 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0921 15:27:25.939997   10408 ops.go:34] apiserver oom_adj: -16
	I0921 15:27:25.940008   10408 kubeadm.go:631] restartCluster took 55.116747244s
	I0921 15:27:25.940013   10408 kubeadm.go:398] StartCluster complete in 55.154103553s
	I0921 15:27:25.940027   10408 settings.go:142] acquiring lock: {Name:mkb00f1de0b91d8f67bd982eab088d27845674b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.940102   10408 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:27:25.941204   10408 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig: {Name:mka2f83e1cbd4124ff7179732fbb172d977cf2f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.942042   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:25.944188   10408 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-20220921152522-3535" rescaled to 1
	I0921 15:27:25.944221   10408 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0921 15:27:25.944255   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0921 15:27:25.944277   10408 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0921 15:27:25.944378   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:27:25.967437   10408 addons.go:65] Setting storage-provisioner=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967440   10408 addons.go:65] Setting default-storageclass=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967359   10408 out.go:177] * Verifying Kubernetes components...
	I0921 15:27:25.967453   10408 addons.go:153] Setting addon storage-provisioner=true in "pause-20220921152522-3535"
	I0921 15:27:25.967457   10408 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-20220921152522-3535"
	W0921 15:27:25.967460   10408 addons.go:162] addon storage-provisioner should already be in state true
	I0921 15:27:26.012377   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:26.012436   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.012762   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012761   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012794   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.012829   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.019897   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53028
	I0921 15:27:26.020028   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53029
	I0921 15:27:26.020328   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020394   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020706   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020719   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020801   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020817   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020929   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021015   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021115   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.021203   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.021283   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.021419   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.021443   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.023750   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:26.027574   10408 addons.go:153] Setting addon default-storageclass=true in "pause-20220921152522-3535"
	W0921 15:27:26.027587   10408 addons.go:162] addon default-storageclass should already be in state true
	I0921 15:27:26.027606   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.027788   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53032
	I0921 15:27:26.027854   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.027880   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.028560   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.029753   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.029767   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.030003   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.030113   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.030207   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.030282   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.031135   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.034331   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53034
	I0921 15:27:26.055199   10408 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0921 15:27:26.038435   10408 node_ready.go:35] waiting up to 6m0s for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.038466   10408 start.go:790] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0921 15:27:26.055642   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.075151   10408 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.075161   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0921 15:27:26.075184   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.075306   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.075441   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.075451   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.075455   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.075546   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.075643   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.075669   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.076075   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.076097   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.082485   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53037
	I0921 15:27:26.082858   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.083217   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.083234   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.083443   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.083534   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.083608   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.083699   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.084503   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.084648   10408 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.084657   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0921 15:27:26.084665   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.084734   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.084830   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.084916   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.085010   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.117393   10408 node_ready.go:49] node "pause-20220921152522-3535" has status "Ready":"True"
	I0921 15:27:26.117403   10408 node_ready.go:38] duration metric: took 42.373374ms waiting for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.117410   10408 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:26.127239   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.137634   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.319821   10408 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.697611   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697627   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697784   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697793   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697804   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697836   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.697938   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697946   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697962   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712622   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712636   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712825   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712834   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712839   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712844   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712846   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712954   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712962   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712969   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712973   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712981   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.713114   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.713128   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.713142   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.735926   10408 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0921 15:27:25.446939   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:27.947781   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:26.773142   10408 addons.go:414] enableAddons completed in 828.831417ms
	I0921 15:27:26.776027   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:26.776040   10408 pod_ready.go:81] duration metric: took 456.205251ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.776049   10408 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117622   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.117632   10408 pod_ready.go:81] duration metric: took 341.577773ms waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117638   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518637   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.518650   10408 pod_ready.go:81] duration metric: took 401.006674ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518660   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918763   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.918778   10408 pod_ready.go:81] duration metric: took 400.10892ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918787   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318657   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.318670   10408 pod_ready.go:81] duration metric: took 399.877205ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318678   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720230   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.720243   10408 pod_ready.go:81] duration metric: took 401.55845ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720250   10408 pod_ready.go:38] duration metric: took 2.602830576s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:28.720263   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:28.720316   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:28.729887   10408 api_server.go:71] duration metric: took 2.78564504s to wait for apiserver process to appear ...
	I0921 15:27:28.729899   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:28.729905   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:28.733744   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:28.734313   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:28.734323   10408 api_server.go:130] duration metric: took 4.419338ms to wait for apiserver health ...
	I0921 15:27:28.734328   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:28.920241   10408 system_pods.go:59] 7 kube-system pods found
	I0921 15:27:28.920257   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:28.920261   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:28.920274   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:28.920279   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:28.920283   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:28.920286   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:28.920289   10408 system_pods.go:61] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:28.920294   10408 system_pods.go:74] duration metric: took 185.961163ms to wait for pod list to return data ...
	I0921 15:27:28.920300   10408 default_sa.go:34] waiting for default service account to be created ...
	I0921 15:27:29.119704   10408 default_sa.go:45] found service account: "default"
	I0921 15:27:29.119720   10408 default_sa.go:55] duration metric: took 199.41576ms for default service account to be created ...
	I0921 15:27:29.119727   10408 system_pods.go:116] waiting for k8s-apps to be running ...
	I0921 15:27:29.322362   10408 system_pods.go:86] 7 kube-system pods found
	I0921 15:27:29.322375   10408 system_pods.go:89] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:29.322379   10408 system_pods.go:89] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:29.322383   10408 system_pods.go:89] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:29.322388   10408 system_pods.go:89] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:29.322391   10408 system_pods.go:89] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:29.322395   10408 system_pods.go:89] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:29.322398   10408 system_pods.go:89] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:29.322402   10408 system_pods.go:126] duration metric: took 202.671392ms to wait for k8s-apps to be running ...
	I0921 15:27:29.322407   10408 system_svc.go:44] waiting for kubelet service to be running ....
	I0921 15:27:29.322452   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:29.331792   10408 system_svc.go:56] duration metric: took 9.381149ms WaitForService to wait for kubelet.
	I0921 15:27:29.331804   10408 kubeadm.go:573] duration metric: took 3.387565971s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0921 15:27:29.331823   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:29.518084   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:29.518100   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:29.518105   10408 node_conditions.go:105] duration metric: took 186.278888ms to run NodePressure ...
	I0921 15:27:29.518113   10408 start.go:216] waiting for startup goroutines ...
	I0921 15:27:29.551427   10408 start.go:506] kubectl: 1.25.0, cluster: 1.25.2 (minor skew: 0)
	I0921 15:27:29.611327   10408 out.go:177] * Done! kubectl is now configured to use "pause-20220921152522-3535" cluster and "default" namespace by default
	
	* 
	* ==> Docker <==
	* -- Journal begins at Wed 2022-09-21 22:25:29 UTC, ends at Wed 2022-09-21 22:27:30 UTC. --
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.405457988Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/64651e97bf148aa1e9fbcad6bfbec4d1e8535ad920f0d5c47cd57190f6804445 pid=5990 runtime=io.containerd.runc.v2
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406210133Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406245445Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406253448Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406435610Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/207eee071672f5cc181475db6e621afacd6722bc026b03a3b344ad50e1cefc78 pid=5992 runtime=io.containerd.runc.v2
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422862395Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422958571Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422967730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.423253250Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/534b0d7cd88d7c2d979cc7e5c6eb29977494de71ff82fec3d02420ecb80a30b9 pid=6024 runtime=io.containerd.runc.v2
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785293775Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785363542Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785372748Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785536470Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1650473a18ef5642e63da9873326d2ed8d331ce75d182aaf5834afe35d8f1c48 pid=6217 runtime=io.containerd.runc.v2
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098886881Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098975354Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098986289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.099142849Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/152338a53f1e4e1033c391833e8d6cba34a8c41caa549b9524e155354c7edd68 pid=6265 runtime=io.containerd.runc.v2
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192601808Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192670528Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192679056Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192948353Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/c41fc7d463dbce833eb22fe2cbe7272c863767af9f5ce4eb37b36c8efa33b012 pid=6532 runtime=io.containerd.runc.v2
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493268572Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493331709Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493341289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493781950Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e6a3aeef0ff7cec28ea93bae81a53252f4adbfe81f9da2e64add46df53fa77f2 pid=6573 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID
	e6a3aeef0ff7c       6e38f40d628db       3 seconds ago        Running             storage-provisioner       0                   c41fc7d463dbc
	152338a53f1e4       1c7d8c51823b5       14 seconds ago       Running             kube-proxy                3                   f67bd5c5d43e1
	1650473a18ef5       5185b96f0becf       15 seconds ago       Running             coredns                   2                   92cc25df1c118
	64651e97bf148       a8a176a5d5d69       23 seconds ago       Running             etcd                      3                   0249ca0da9611
	207eee071672f       ca0ea1ee3cfd3       23 seconds ago       Running             kube-scheduler            3                   522a493620409
	534b0d7cd88d7       dbfceb93c69b6       23 seconds ago       Running             kube-controller-manager   3                   f60c5ce6318fc
	b6d4531497f33       97801f8394908       28 seconds ago       Running             kube-apiserver            3                   0ca250926532e
	d7cbc4c453b05       ca0ea1ee3cfd3       39 seconds ago       Exited              kube-scheduler            2                   1a3e01fca5715
	823942ffecb6f       dbfceb93c69b6       42 seconds ago       Exited              kube-controller-manager   2                   e1129956136e0
	283fac289f860       a8a176a5d5d69       43 seconds ago       Exited              etcd                      2                   eb1318ed7bcc9
	c2e8fe8419a96       1c7d8c51823b5       44 seconds ago       Exited              kube-proxy                2                   994dd806c8bfd
	4934b6e15931f       5185b96f0becf       About a minute ago   Exited              coredns                   1                   163c82f50ebf1
	3a4741e1fe3c0       97801f8394908       About a minute ago   Exited              kube-apiserver            2                   3d0143698c2dc
	
	* 
	* ==> coredns [1650473a18ef] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> coredns [4934b6e15931] <==
	* [INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": net/http: TLS handshake timeout
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: network is unreachable
	
	* 
	* ==> describe nodes <==
	* Name:               pause-20220921152522-3535
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-20220921152522-3535
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=937c68716dfaac5b5ffa3b6655158d5d3472b8c4
	                    minikube.k8s.io/name=pause-20220921152522-3535
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_09_21T15_25_59_0700
	                    minikube.k8s.io/version=v1.27.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 21 Sep 2022 22:25:58 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-20220921152522-3535
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 21 Sep 2022 22:27:24 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:26:09 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.28
	  Hostname:    pause-20220921152522-3535
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 0962272db386446fb19d5815e48c70e2
	  System UUID:                485511ed-0000-0000-82c9-149d997fca88
	  Boot ID:                    e52786ed-2040-47a8-9190-c9c808b4a98b
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.18
	  Kubelet Version:            v1.25.2
	  Kube-Proxy Version:         v1.25.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                 ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-9wtnp                             100m (5%!)(MISSING)     0 (0%!)(MISSING)      70Mi (3%!)(MISSING)        170Mi (8%!)(MISSING)     80s
	  kube-system                 etcd-pause-20220921152522-3535                       100m (5%!)(MISSING)     0 (0%!)(MISSING)      100Mi (5%!)(MISSING)       0 (0%!)(MISSING)         92s
	  kube-system                 kube-apiserver-pause-20220921152522-3535             250m (12%!)(MISSING)    0 (0%!)(MISSING)      0 (0%!)(MISSING)           0 (0%!)(MISSING)         92s
	  kube-system                 kube-controller-manager-pause-20220921152522-3535    200m (10%!)(MISSING)    0 (0%!)(MISSING)      0 (0%!)(MISSING)           0 (0%!)(MISSING)         92s
	  kube-system                 kube-proxy-5c7jc                                     0 (0%!)(MISSING)        0 (0%!)(MISSING)      0 (0%!)(MISSING)           0 (0%!)(MISSING)         80s
	  kube-system                 kube-scheduler-pause-20220921152522-3535             100m (5%!)(MISSING)     0 (0%!)(MISSING)      0 (0%!)(MISSING)           0 (0%!)(MISSING)         92s
	  kube-system                 storage-provisioner                                  0 (0%!)(MISSING)        0 (0%!)(MISSING)      0 (0%!)(MISSING)           0 (0%!)(MISSING)         5s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 78s                  kube-proxy       
	  Normal  Starting                 14s                  kube-proxy       
	  Normal  Starting                 63s                  kube-proxy       
	  Normal  NodeHasSufficientPID     106s (x5 over 106s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  NodeHasNoDiskPressure    106s (x6 over 106s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  106s (x6 over 106s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  Starting                 92s                  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  92s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  92s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    92s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     92s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  NodeReady                82s                  kubelet          Node pause-20220921152522-3535 status is now: NodeReady
	  Normal  RegisteredNode           80s                  node-controller  Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller
	  Normal  Starting                 25s                  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  25s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  24s (x8 over 25s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    24s (x8 over 25s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     24s (x7 over 25s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4s                   node-controller  Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000001] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.836758] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000004] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000000] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.731337] systemd-fstab-generator[530]: Ignoring "noauto" for root device
	[  +0.090984] systemd-fstab-generator[541]: Ignoring "noauto" for root device
	[  +5.027202] systemd-fstab-generator[762]: Ignoring "noauto" for root device
	[  +1.197234] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.214769] systemd-fstab-generator[921]: Ignoring "noauto" for root device
	[  +0.091300] systemd-fstab-generator[932]: Ignoring "noauto" for root device
	[  +0.097321] systemd-fstab-generator[943]: Ignoring "noauto" for root device
	[  +1.296604] systemd-fstab-generator[1093]: Ignoring "noauto" for root device
	[  +0.087737] systemd-fstab-generator[1104]: Ignoring "noauto" for root device
	[  +3.910315] systemd-fstab-generator[1322]: Ignoring "noauto" for root device
	[  +0.546371] kauditd_printk_skb: 68 callbacks suppressed
	[ +13.692006] systemd-fstab-generator[1995]: Ignoring "noauto" for root device
	[Sep21 22:26] kauditd_printk_skb: 8 callbacks suppressed
	[  +8.344097] systemd-fstab-generator[2768]: Ignoring "noauto" for root device
	[  +0.136976] systemd-fstab-generator[2779]: Ignoring "noauto" for root device
	[  +0.134278] systemd-fstab-generator[2790]: Ignoring "noauto" for root device
	[  +0.497533] kauditd_printk_skb: 17 callbacks suppressed
	[  +7.690771] systemd-fstab-generator[4167]: Ignoring "noauto" for root device
	[  +0.127432] systemd-fstab-generator[4182]: Ignoring "noauto" for root device
	[ +31.144308] kauditd_printk_skb: 34 callbacks suppressed
	[Sep21 22:27] systemd-fstab-generator[5830]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [283fac289f86] <==
	* {"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:26:49.366Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 is starting a new election at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became pre-candidate at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgPreVoteResp from d3378a43e4252963 at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became candidate at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgVoteResp from d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became leader at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: d3378a43e4252963 elected leader d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"d3378a43e4252963","local-member-attributes":"{Name:pause-20220921152522-3535 ClientURLs:[https://192.168.64.28:2379]}","request-path":"/0/members/d3378a43e4252963/attributes","cluster-id":"e703c3abd1a7846","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:26:49.368Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-21T22:26:49.370Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.28:2379"}
	{"level":"info","ts":"2022-09-21T22:26:49.375Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-21T22:26:49.376Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-09-21T22:27:00.388Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-09-21T22:27:00.388Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-20220921152522-3535","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"]}
	WARNING: 2022/09/21 22:27:00 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/09/21 22:27:00 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.28:2379 192.168.64.28:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.28:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-09-21T22:27:00.391Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"d3378a43e4252963","current-leader-member-id":"d3378a43e4252963"}
	{"level":"info","ts":"2022-09-21T22:27:00.392Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:00.394Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:00.394Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-20220921152522-3535","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"]}
	
	* 
	* ==> etcd [64651e97bf14] <==
	* {"level":"info","ts":"2022-09-21T22:27:08.280Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"d3378a43e4252963","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-09-21T22:27:08.282Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"d3378a43e4252963","initial-advertise-peer-urls":["https://192.168.64.28:2380"],"listen-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.28:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-21T22:27:08.282Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 switched to configuration voters=(15219785489916963171)"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"e703c3abd1a7846","local-member-id":"d3378a43e4252963","added-peer-id":"d3378a43e4252963","added-peer-peer-urls":["https://192.168.64.28:2380"]}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"e703c3abd1a7846","local-member-id":"d3378a43e4252963","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:08.285Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 is starting a new election at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became pre-candidate at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgPreVoteResp from d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became candidate at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgVoteResp from d3378a43e4252963 at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became leader at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: d3378a43e4252963 elected leader d3378a43e4252963 at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"d3378a43e4252963","local-member-attributes":"{Name:pause-20220921152522-3535 ClientURLs:[https://192.168.64.28:2379]}","request-path":"/0/members/d3378a43e4252963/attributes","cluster-id":"e703c3abd1a7846","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-21T22:27:09.548Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:27:09.548Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.28:2379"}
	{"level":"info","ts":"2022-09-21T22:27:09.549Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:27:09.549Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-21T22:27:09.550Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-21T22:27:09.550Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> kernel <==
	*  22:27:31 up 2 min,  0 users,  load average: 0.39, 0.20, 0.08
	Linux pause-20220921152522-3535 5.10.57 #1 SMP Sat Sep 10 02:24:46 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [3a4741e1fe3c] <==
	* W0921 22:26:42.249889       1 logging.go:59] [core] [Channel #3 SubChannel #5] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0921 22:26:42.252491       1 logging.go:59] [core] [Channel #4 SubChannel #6] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0921 22:26:47.844900       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	E0921 22:26:51.410448       1 run.go:74] "command failed" err="context deadline exceeded"
	
	* 
	* ==> kube-apiserver [b6d4531497f3] <==
	* I0921 22:27:14.062878       1 controller.go:85] Starting OpenAPI controller
	I0921 22:27:14.063014       1 controller.go:85] Starting OpenAPI V3 controller
	I0921 22:27:14.063120       1 naming_controller.go:291] Starting NamingConditionController
	I0921 22:27:14.063157       1 establishing_controller.go:76] Starting EstablishingController
	I0921 22:27:14.063169       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I0921 22:27:14.063271       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0921 22:27:14.063303       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0921 22:27:14.071305       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0921 22:27:14.072396       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0921 22:27:14.156918       1 cache.go:39] Caches are synced for autoregister controller
	I0921 22:27:14.157381       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0921 22:27:14.159134       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0921 22:27:14.160295       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I0921 22:27:14.162748       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I0921 22:27:14.164291       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I0921 22:27:14.214291       1 shared_informer.go:262] Caches are synced for node_authorizer
	I0921 22:27:14.252859       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I0921 22:27:14.849364       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I0921 22:27:15.061773       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0921 22:27:15.487959       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I0921 22:27:15.496083       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I0921 22:27:15.512729       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I0921 22:27:15.525104       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0921 22:27:15.528873       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0921 22:27:26.810346       1 controller.go:616] quota admission added evaluator for: endpoints
	
	* 
	* ==> kube-controller-manager [534b0d7cd88d] <==
	* I0921 22:27:27.091965       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	W0921 22:27:27.092105       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-20220921152522-3535. Assuming now as a timestamp.
	I0921 22:27:27.092144       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I0921 22:27:27.092272       1 event.go:294] "Event occurred" object="pause-20220921152522-3535" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller"
	I0921 22:27:27.110604       1 shared_informer.go:262] Caches are synced for TTL
	I0921 22:27:27.111981       1 shared_informer.go:262] Caches are synced for ReplicaSet
	I0921 22:27:27.112202       1 shared_informer.go:262] Caches are synced for HPA
	I0921 22:27:27.112592       1 shared_informer.go:262] Caches are synced for TTL after finished
	I0921 22:27:27.115223       1 shared_informer.go:262] Caches are synced for namespace
	I0921 22:27:27.118788       1 shared_informer.go:262] Caches are synced for job
	I0921 22:27:27.122949       1 shared_informer.go:262] Caches are synced for cronjob
	I0921 22:27:27.126944       1 shared_informer.go:262] Caches are synced for endpoint
	I0921 22:27:27.160485       1 shared_informer.go:262] Caches are synced for expand
	I0921 22:27:27.173668       1 shared_informer.go:262] Caches are synced for persistent volume
	I0921 22:27:27.175944       1 shared_informer.go:262] Caches are synced for endpoint_slice_mirroring
	I0921 22:27:27.203878       1 shared_informer.go:262] Caches are synced for attach detach
	I0921 22:27:27.211345       1 shared_informer.go:262] Caches are synced for PV protection
	I0921 22:27:27.216091       1 shared_informer.go:262] Caches are synced for resource quota
	I0921 22:27:27.220621       1 shared_informer.go:262] Caches are synced for stateful set
	I0921 22:27:27.261055       1 shared_informer.go:262] Caches are synced for endpoint_slice
	I0921 22:27:27.269364       1 shared_informer.go:262] Caches are synced for resource quota
	I0921 22:27:27.311010       1 shared_informer.go:262] Caches are synced for daemon sets
	I0921 22:27:27.654916       1 shared_informer.go:262] Caches are synced for garbage collector
	I0921 22:27:27.686746       1 shared_informer.go:262] Caches are synced for garbage collector
	I0921 22:27:27.686841       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-controller-manager [823942ffecb6] <==
	* I0921 22:26:49.430074       1 serving.go:348] Generated self-signed cert in-memory
	I0921 22:26:50.068771       1 controllermanager.go:178] Version: v1.25.2
	I0921 22:26:50.068811       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:26:50.069610       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I0921 22:26:50.069706       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0921 22:26:50.069775       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0921 22:26:50.070146       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	
	* 
	* ==> kube-proxy [152338a53f1e] <==
	* I0921 22:27:16.200105       1 node.go:163] Successfully retrieved node IP: 192.168.64.28
	I0921 22:27:16.200255       1 server_others.go:138] "Detected node IP" address="192.168.64.28"
	I0921 22:27:16.200284       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0921 22:27:16.220796       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0921 22:27:16.220810       1 server_others.go:206] "Using iptables Proxier"
	I0921 22:27:16.220829       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I0921 22:27:16.221038       1 server.go:661] "Version info" version="v1.25.2"
	I0921 22:27:16.221047       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:27:16.221421       1 config.go:317] "Starting service config controller"
	I0921 22:27:16.221427       1 shared_informer.go:255] Waiting for caches to sync for service config
	I0921 22:27:16.221438       1 config.go:226] "Starting endpoint slice config controller"
	I0921 22:27:16.221440       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I0921 22:27:16.221790       1 config.go:444] "Starting node config controller"
	I0921 22:27:16.221831       1 shared_informer.go:255] Waiting for caches to sync for node config
	I0921 22:27:16.321553       1 shared_informer.go:262] Caches are synced for endpoint slice config
	I0921 22:27:16.321868       1 shared_informer.go:262] Caches are synced for service config
	I0921 22:27:16.322427       1 shared_informer.go:262] Caches are synced for node config
	
	* 
	* ==> kube-proxy [c2e8fe8419a9] <==
	* E0921 22:26:52.417919       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.64.28:45762->192.168.64.28:8443: read: connection reset by peer
	E0921 22:26:53.525473       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:55.541635       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.072196       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [207eee071672] <==
	* I0921 22:27:07.942128       1 serving.go:348] Generated self-signed cert in-memory
	W0921 22:27:14.136528       1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W0921 22:27:14.136587       1 authentication.go:346] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W0921 22:27:14.136596       1 authentication.go:347] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0921 22:27:14.136622       1 authentication.go:348] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0921 22:27:14.160522       1 server.go:148] "Starting Kubernetes Scheduler" version="v1.25.2"
	I0921 22:27:14.160612       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:27:14.161435       1 secure_serving.go:210] Serving securely on 127.0.0.1:10259
	I0921 22:27:14.161580       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0921 22:27:14.163051       1 shared_informer.go:255] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0921 22:27:14.161599       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0921 22:27:14.263724       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kube-scheduler [d7cbc4c453b0] <==
	* W0921 22:26:56.662066       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: Get "https://192.168.64.28:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.662326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.168.64.28:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.676873       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.677417       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.727262       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.28:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.727389       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.28:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.792874       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.28:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.792933       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.28:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:57.019135       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:57.019287       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:57.111170       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.28:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:57.111256       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.28:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:59.563534       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.28:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:59.563559       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.28:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:59.965353       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:59.965379       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:27:00.044825       1 reflector.go:424] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: Get "https://192.168.64.28:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.044871       1 reflector.go:140] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.168.64.28:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:27:00.384285       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.384326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.398528       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0921 22:27:00.398546       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0921 22:27:00.398572       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I0921 22:27:00.398622       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E0921 22:27:00.398861       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Wed 2022-09-21 22:25:29 UTC, ends at Wed 2022-09-21 22:27:32 UTC. --
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.739144    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.839713    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.940319    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:14.040786    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.141509    5836 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.142001    5836 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.235105    5836 kubelet_node_status.go:108] "Node was previously registered" node="pause-20220921152522-3535"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.235257    5836 kubelet_node_status.go:73] "Successfully registered node" node="pause-20220921152522-3535"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.845723    5836 apiserver.go:52] "Watching apiserver"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.847588    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.847682    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951602    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-kube-proxy\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951731    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-lib-modules\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951776    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-xtables-lock\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951850    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb8f3bae-6107-4a2b-ba32-d79405830bf0-config-volume\") pod \"coredns-565d847f94-9wtnp\" (UID: \"eb8f3bae-6107-4a2b-ba32-d79405830bf0\") " pod="kube-system/coredns-565d847f94-9wtnp"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951882    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2kwd\" (UniqueName: \"kubernetes.io/projected/eb8f3bae-6107-4a2b-ba32-d79405830bf0-kube-api-access-p2kwd\") pod \"coredns-565d847f94-9wtnp\" (UID: \"eb8f3bae-6107-4a2b-ba32-d79405830bf0\") " pod="kube-system/coredns-565d847f94-9wtnp"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951915    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2rf\" (UniqueName: \"kubernetes.io/projected/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-kube-api-access-zh2rf\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951971    5836 reconciler.go:169] "Reconciler: start to sync state"
	Sep 21 22:27:15 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:15.748097    5836 scope.go:115] "RemoveContainer" containerID="4934b6e15931f96c8cd7409c9d9d107463001d3dbbe402bc7ecacd045cfdf26e"
	Sep 21 22:27:16 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:16.049291    5836 scope.go:115] "RemoveContainer" containerID="c2e8fe8419a96380dd14dec68931ed3399dbf26a6ff33aace75ae52a339d8568"
	Sep 21 22:27:23 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:23.685529    5836 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.821517    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.979546    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/f71f00f0-f421-45c2-bfe4-c1e99f11b8e5-tmp\") pod \"storage-provisioner\" (UID: \"f71f00f0-f421-45c2-bfe4-c1e99f11b8e5\") " pod="kube-system/storage-provisioner"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.979717    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2k8\" (UniqueName: \"kubernetes.io/projected/f71f00f0-f421-45c2-bfe4-c1e99f11b8e5-kube-api-access-tv2k8\") pod \"storage-provisioner\" (UID: \"f71f00f0-f421-45c2-bfe4-c1e99f11b8e5\") " pod="kube-system/storage-provisioner"
	Sep 21 22:27:27 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:27.456744    5836 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="c41fc7d463dbce833eb22fe2cbe7272c863767af9f5ce4eb37b36c8efa33b012"
	
	* 
	* ==> storage-provisioner [e6a3aeef0ff7] <==
	* I0921 22:27:27.575776       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0921 22:27:27.585007       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0921 22:27:27.585247       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0921 22:27:27.589937       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0921 22:27:27.590215       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1!
	I0921 22:27:27.591354       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"cea77369-71af-4aec-8a4d-59cc48396b09", APIVersion:"v1", ResourceVersion:"467", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1 became leader
	I0921 22:27:27.690985       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1!
	
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-20220921152522-3535 -n pause-20220921152522-3535
helpers_test.go:261: (dbg) Run:  kubectl --context pause-20220921152522-3535 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-20220921152522-3535 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-20220921152522-3535 describe pod : exit status 1 (36.087493ms)
** stderr ** 
	error: resource name may not be empty
** /stderr **
helpers_test.go:277: kubectl --context pause-20220921152522-3535 describe pod : exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-20220921152522-3535 -n pause-20220921152522-3535
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-20220921152522-3535 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-20220921152522-3535 logs -n 25: (2.728061495s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	| Command |                  Args                  |                Profile                 |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:20 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.2           |                                        |         |         |                     |                     |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT |                     |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0           |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	|         | --memory=2200                          |                                        |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.2           |                                        |         |         |                     |                     |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | kubernetes-upgrade-20220921151918-3535 | jenkins | v1.27.0 | 21 Sep 22 15:21 PDT | 21 Sep 22 15:21 PDT |
	|         | kubernetes-upgrade-20220921151918-3535 |                                        |         |         |                     |                     |
	| start   | -p                                     | cert-expiration-20220921151821-3535    | jenkins | v1.27.0 | 21 Sep 22 15:22 PDT | 21 Sep 22 15:22 PDT |
	|         | cert-expiration-20220921151821-3535    |                                        |         |         |                     |                     |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --cert-expiration=8760h                |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | cert-expiration-20220921151821-3535    | jenkins | v1.27.0 | 21 Sep 22 15:22 PDT | 21 Sep 22 15:22 PDT |
	|         | cert-expiration-20220921151821-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | stopped-upgrade-20220921152137-3535    | jenkins | v1.27.0 | 21 Sep 22 15:23 PDT | 21 Sep 22 15:24 PDT |
	|         | stopped-upgrade-20220921152137-3535    |                                        |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr        |                                        |         |         |                     |                     |
	|         | -v=1 --driver=hyperkit                 |                                        |         |         |                     |                     |
	| start   | -p                                     | running-upgrade-20220921152233-3535    | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:25 PDT |
	|         | running-upgrade-20220921152233-3535    |                                        |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr        |                                        |         |         |                     |                     |
	|         | -v=1 --driver=hyperkit                 |                                        |         |         |                     |                     |
	| delete  | -p                                     | stopped-upgrade-20220921152137-3535    | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:24 PDT |
	|         | stopped-upgrade-20220921152137-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --kubernetes-version=1.20              |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:24 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| delete  | -p                                     | running-upgrade-20220921152233-3535    | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | running-upgrade-20220921152233-3535    |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p pause-20220921152522-3535           | pause-20220921152522-3535              | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:26 PDT |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --install-addons=false                 |                                        |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit           |                                        |         |         |                     |                     |
	| delete  | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --no-kubernetes                        |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| ssh     | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet       |                                        |         |         |                     |                     |
	|         | service kubelet                        |                                        |         |         |                     |                     |
	| profile | list                                   | minikube                               | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	| profile | list --output=json                     | minikube                               | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	| stop    | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:25 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:25 PDT | 21 Sep 22 15:26 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| ssh     | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT |                     |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet       |                                        |         |         |                     |                     |
	|         | service kubelet                        |                                        |         |         |                     |                     |
	| delete  | -p                                     | NoKubernetes-20220921152435-3535       | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT | 21 Sep 22 15:26 PDT |
	|         | NoKubernetes-20220921152435-3535       |                                        |         |         |                     |                     |
	| start   | -p false-20220921151637-3535           | false-20220921151637-3535              | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT |                     |
	|         | --memory=2048                          |                                        |         |         |                     |                     |
	|         | --alsologtostderr --wait=true          |                                        |         |         |                     |                     |
	|         | --wait-timeout=5m --cni=false          |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	| start   | -p pause-20220921152522-3535           | pause-20220921152522-3535              | jenkins | v1.27.0 | 21 Sep 22 15:26 PDT | 21 Sep 22 15:27 PDT |
	|         | --alsologtostderr -v=1                 |                                        |         |         |                     |                     |
	|         | --driver=hyperkit                      |                                        |         |         |                     |                     |
	|---------|----------------------------------------|----------------------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/21 15:26:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.19.1 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0921 15:26:16.412297   10408 out.go:296] Setting OutFile to fd 1 ...
	I0921 15:26:16.412857   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.412883   10408 out.go:309] Setting ErrFile to fd 2...
	I0921 15:26:16.412925   10408 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:26:16.413172   10408 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 15:26:16.413935   10408 out.go:303] Setting JSON to false
	I0921 15:26:16.429337   10408 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":5147,"bootTime":1663794029,"procs":382,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 15:26:16.429439   10408 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 15:26:16.451061   10408 out.go:177] * [pause-20220921152522-3535] minikube v1.27.0 on Darwin 12.6
	I0921 15:26:16.492895   10408 notify.go:214] Checking for updates...
	I0921 15:26:16.513942   10408 out.go:177]   - MINIKUBE_LOCATION=14995
	I0921 15:26:16.535147   10408 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:26:16.555899   10408 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 15:26:16.577004   10408 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 15:26:16.598036   10408 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	I0921 15:26:16.619232   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:16.619572   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.619620   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.626042   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52950
	I0921 15:26:16.626541   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.626992   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.627004   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.627211   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.627372   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.627501   10408 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 15:26:16.627783   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.627806   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:16.634000   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52952
	I0921 15:26:16.634367   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:16.634679   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:16.634691   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:16.634960   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:16.635067   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:16.661930   10408 out.go:177] * Using the hyperkit driver based on existing profile
	I0921 15:26:16.703890   10408 start.go:284] selected driver: hyperkit
	I0921 15:26:16.703910   10408 start.go:808] validating driver "hyperkit" against &{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterNam
e:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort
:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.704025   10408 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0921 15:26:16.704092   10408 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.704203   10408 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0921 15:26:16.710571   10408 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.27.0
	I0921 15:26:16.713621   10408 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:16.713649   10408 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0921 15:26:16.715630   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:16.715647   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:16.715664   10408 start_flags.go:316] config:
	{Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:16.715818   10408 iso.go:124] acquiring lock: {Name:mke8c57399926d29e846b47dd4be4625ba5fcaea Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 15:26:16.774023   10408 out.go:177] * Starting control plane node pause-20220921152522-3535 in cluster pause-20220921152522-3535
	I0921 15:26:14.112290   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0921 15:26:14.112374   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0921 15:26:14.112386   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | 2022/09/21 15:26:14 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0921 15:26:15.320346   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Attempt 3
	I0921 15:26:15.320365   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:15.320474   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:15.321107   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Searching for 36:15:df:cc:5b:5b in /var/db/dhcpd_leases ...
	I0921 15:26:15.321174   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found 28 entries in /var/db/dhcpd_leases!
	I0921 15:26:15.321185   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:3e:7a:92:24:5:ce ID:1,3e:7a:92:24:5:ce Lease:0x632b8f7f}
	I0921 15:26:15.321194   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:c2:90:21:6e:75:6 ID:1,c2:90:21:6e:75:6 Lease:0x632ce0da}
	I0921 15:26:15.321202   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:9e:f3:b1:1c:9b:1c ID:1,9e:f3:b1:1c:9b:1c Lease:0x632b8f54}
	I0921 15:26:15.321211   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:66:c5:83:6d:55:91 ID:1,66:c5:83:6d:55:91 Lease:0x632ce03b}
	I0921 15:26:15.321220   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:ea:9c:f4:77:1d:3d ID:1,ea:9c:f4:77:1d:3d Lease:0x632ce076}
	I0921 15:26:15.321227   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:36:e:45:14:25:55 ID:1,36:e:45:14:25:55 Lease:0x632cdfb6}
	I0921 15:26:15.321236   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:92:2e:30:54:49:f3 ID:1,92:2e:30:54:49:f3 Lease:0x632b8de5}
	I0921 15:26:15.321243   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:1a:83:83:3:65:1a ID:1,1a:83:83:3:65:1a Lease:0x632cdf36}
	I0921 15:26:15.321252   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:b6:1a:2d:8:65:c5 ID:1,b6:1a:2d:8:65:c5 Lease:0x632cdf16}
	I0921 15:26:15.321259   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:72:4c:c8:cf:4f:63 ID:1,72:4c:c8:cf:4f:63 Lease:0x632b8dac}
	I0921 15:26:15.321274   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:c2:f8:ac:87:d9:f0 ID:1,c2:f8:ac:87:d9:f0 Lease:0x632b8d80}
	I0921 15:26:15.321291   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:35:c1:26:64:c0 ID:1,62:35:c1:26:64:c0 Lease:0x632b8d81}
	I0921 15:26:15.321303   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:96:24:b5:8e:13:fc ID:1,96:24:b5:8e:13:fc Lease:0x632cde86}
	I0921 15:26:15.321315   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:e:f1:67:89:3f:e3 ID:1,e:f1:67:89:3f:e3 Lease:0x632cde14}
	I0921 15:26:15.321324   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:a2:3d:49:78:3b:4c ID:1,a2:3d:49:78:3b:4c Lease:0x632cdd68}
	I0921 15:26:15.321339   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:1a:dd:bc:c:73:c4 ID:1,1a:dd:bc:c:73:c4 Lease:0x632cdd35}
	I0921 15:26:15.321350   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:52:e5:24:3b:ab:4 ID:1,52:e5:24:3b:ab:4 Lease:0x632b897b}
	I0921 15:26:15.321358   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:be:b4:fe:f4:b1:24 ID:1,be:b4:fe:f4:b1:24 Lease:0x632b8bde}
	I0921 15:26:15.321365   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8a:c8:9b:80:80:10 ID:1,8a:c8:9b:80:80:10 Lease:0x632b8bdc}
	I0921 15:26:15.321376   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:12:72:ad:9f:f1:8f ID:1,12:72:ad:9f:f1:8f Lease:0x632b8511}
	I0921 15:26:15.321387   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:4a:58:20:58:21:84 ID:1,4a:58:20:58:21:84 Lease:0x632b84fc}
	I0921 15:26:15.321395   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:4e:eb:64:20:d8:40 ID:1,4e:eb:64:20:d8:40 Lease:0x632b84d4}
	I0921 15:26:15.321404   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:96:cb:c8:56:48:73 ID:1,96:cb:c8:56:48:73 Lease:0x632cd609}
	I0921 15:26:15.321411   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:3e:60:ad:7c:55:a0 ID:1,3e:60:ad:7c:55:a0 Lease:0x632cd5c9}
	I0921 15:26:15.321418   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:2:7a:1a:6a:a6:1f ID:1,2:7a:1a:6a:a6:1f Lease:0x632b843f}
	I0921 15:26:15.321426   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9a:e7:f8:d0:27:5a ID:1,9a:e7:f8:d0:27:5a Lease:0x632cd449}
	I0921 15:26:15.321434   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:12:80:14:fc:de:ba ID:1,12:80:14:fc:de:ba Lease:0x632b82be}
	I0921 15:26:15.321440   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:56:cf:47:52:47:7e ID:1,56:cf:47:52:47:7e Lease:0x632b8281}
	I0921 15:26:17.321647   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Attempt 4
	I0921 15:26:17.321668   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:17.321761   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:17.322288   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Searching for 36:15:df:cc:5b:5b in /var/db/dhcpd_leases ...
	I0921 15:26:17.322356   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found 29 entries in /var/db/dhcpd_leases!
	I0921 15:26:17.322367   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:36:15:df:cc:5b:5b ID:1,36:15:df:cc:5b:5b Lease:0x632ce108}
	I0921 15:26:17.322380   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Found match: 36:15:df:cc:5b:5b
	I0921 15:26:17.322390   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | IP: 192.168.64.30
	I0921 15:26:17.322428   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetConfigRaw
	I0921 15:26:17.322951   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:17.323049   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:17.323142   10389 main.go:134] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0921 15:26:17.323154   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:17.323221   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:17.323276   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:17.323815   10389 main.go:134] libmachine: Detecting operating system of created instance...
	I0921 15:26:17.323821   10389 main.go:134] libmachine: Waiting for SSH to be available...
	I0921 15:26:17.323832   10389 main.go:134] libmachine: Getting to WaitForSSH function...
	I0921 15:26:17.323840   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:17.323909   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:17.323997   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:17.324070   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:17.324148   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:17.324242   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:17.324383   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:17.324389   10389 main.go:134] libmachine: About to run SSH command:
	exit 0
	I0921 15:26:16.794876   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:16.794956   10408 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4
	I0921 15:26:16.795012   10408 cache.go:57] Caching tarball of preloaded images
	I0921 15:26:16.795122   10408 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0921 15:26:16.795144   10408 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.2 on docker
	I0921 15:26:16.795239   10408 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/config.json ...
	I0921 15:26:16.795594   10408 cache.go:208] Successfully downloaded all kic artifacts
	I0921 15:26:16.795620   10408 start.go:364] acquiring machines lock for pause-20220921152522-3535: {Name:mk2f7774d81f069136708da9f7558413d7930511 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0921 15:26:19.803647   10408 start.go:368] acquired machines lock for "pause-20220921152522-3535" in 3.008011859s
	I0921 15:26:19.803693   10408 start.go:96] Skipping create...Using existing machine configuration
	I0921 15:26:19.803704   10408 fix.go:55] fixHost starting: 
	I0921 15:26:19.804014   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:19.804040   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:19.810489   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:52975
	I0921 15:26:19.810845   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:19.811156   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:26:19.811167   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:19.811357   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:19.811458   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.811557   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:26:19.811664   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:19.811739   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:26:19.812542   10408 fix.go:103] recreateIfNeeded on pause-20220921152522-3535: state=Running err=<nil>
	W0921 15:26:19.812564   10408 fix.go:129] unexpected machine state, will restart: <nil>
	I0921 15:26:19.835428   10408 out.go:177] * Updating the running hyperkit "pause-20220921152522-3535" VM ...
	I0921 15:26:19.856170   10408 machine.go:88] provisioning docker machine ...
	I0921 15:26:19.856192   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:19.856377   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856478   10408 buildroot.go:166] provisioning hostname "pause-20220921152522-3535"
	I0921 15:26:19.856489   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.856574   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.856646   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.856744   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856835   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.856914   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.857028   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.857193   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.857203   10408 main.go:134] libmachine: About to run SSH command:
	sudo hostname pause-20220921152522-3535 && echo "pause-20220921152522-3535" | sudo tee /etc/hostname
	I0921 15:26:19.929633   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: pause-20220921152522-3535
	
	I0921 15:26:19.929693   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.929883   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:19.930020   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930143   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.930253   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:19.930438   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.930577   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:19.930595   10408 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-20220921152522-3535' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-20220921152522-3535/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-20220921152522-3535' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0921 15:26:19.992780   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:19.992803   10408 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube}
	I0921 15:26:19.992832   10408 buildroot.go:174] setting up certificates
	I0921 15:26:19.992843   10408 provision.go:83] configureAuth start
	I0921 15:26:19.992852   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetMachineName
	I0921 15:26:19.993017   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:19.993132   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:19.993213   10408 provision.go:138] copyHostCerts
	I0921 15:26:19.993302   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem, removing ...
	I0921 15:26:19.993310   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem
	I0921 15:26:19.993450   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem (1123 bytes)
	I0921 15:26:19.993643   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem, removing ...
	I0921 15:26:19.993649   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem
	I0921 15:26:19.993780   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem (1679 bytes)
	I0921 15:26:19.994087   10408 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem, removing ...
	I0921 15:26:19.994094   10408 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem
	I0921 15:26:19.994203   10408 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem (1078 bytes)
	I0921 15:26:19.994341   10408 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem org=jenkins.pause-20220921152522-3535 san=[192.168.64.28 192.168.64.28 localhost 127.0.0.1 minikube pause-20220921152522-3535]
	I0921 15:26:20.145157   10408 provision.go:172] copyRemoteCerts
	I0921 15:26:20.145229   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0921 15:26:20.145247   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.145395   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.145492   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.145591   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.145687   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.181860   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0921 15:26:20.204288   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0921 15:26:20.223046   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0921 15:26:20.242859   10408 provision.go:86] duration metric: configureAuth took 250.000259ms
	I0921 15:26:20.242872   10408 buildroot.go:189] setting minikube options for container-runtime
	I0921 15:26:20.243031   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:20.243050   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.243218   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.243320   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.243440   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243555   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.243661   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.243798   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.243914   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.243922   10408 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0921 15:26:20.307004   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0921 15:26:20.307030   10408 buildroot.go:70] root file system type: tmpfs
	I0921 15:26:20.307188   10408 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0921 15:26:20.307206   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.307379   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.307501   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307587   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.307679   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.307823   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.307954   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.308011   10408 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0921 15:26:20.380017   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0921 15:26:20.380044   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.380193   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.380302   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380410   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.380514   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.380665   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.380781   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.380797   10408 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0921 15:26:20.447616   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:20.447629   10408 machine.go:91] provisioned docker machine in 591.445478ms
	I0921 15:26:20.447641   10408 start.go:300] post-start starting for "pause-20220921152522-3535" (driver="hyperkit")
	I0921 15:26:20.447646   10408 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0921 15:26:20.447659   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.447885   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0921 15:26:20.447901   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.448051   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.448156   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.448291   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.448405   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.484862   10408 ssh_runner.go:195] Run: cat /etc/os-release
	I0921 15:26:20.487726   10408 info.go:137] Remote host: Buildroot 2021.02.12
	I0921 15:26:20.487742   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/addons for local assets ...
	I0921 15:26:20.487867   10408 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files for local assets ...
	I0921 15:26:20.488046   10408 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem -> 35352.pem in /etc/ssl/certs
	I0921 15:26:20.488202   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0921 15:26:20.495074   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:20.515167   10408 start.go:303] post-start completed in 67.502258ms
	I0921 15:26:20.515187   10408 fix.go:57] fixHost completed within 711.484594ms
	I0921 15:26:20.515203   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.515368   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.515520   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515638   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.515770   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.515941   10408 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:20.516053   10408 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.28 22 <nil> <nil>}
	I0921 15:26:20.516063   10408 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0921 15:26:20.577712   10408 main.go:134] libmachine: SSH cmd err, output: <nil>: 1663799180.686854068
	
	I0921 15:26:20.577735   10408 fix.go:207] guest clock: 1663799180.686854068
	I0921 15:26:20.577746   10408 fix.go:220] Guest: 2022-09-21 15:26:20.686854068 -0700 PDT Remote: 2022-09-21 15:26:20.51519 -0700 PDT m=+4.146234536 (delta=171.664068ms)
	I0921 15:26:20.577765   10408 fix.go:191] guest clock delta is within tolerance: 171.664068ms
	I0921 15:26:20.577770   10408 start.go:83] releasing machines lock for "pause-20220921152522-3535", held for 774.111447ms
	I0921 15:26:20.577789   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.577928   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetIP
	I0921 15:26:20.578042   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578174   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578318   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578705   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:26:20.578906   10408 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0921 15:26:20.578961   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.578984   10408 ssh_runner.go:195] Run: systemctl --version
	I0921 15:26:20.578999   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:26:20.579066   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579106   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:26:20.579182   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579228   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:26:20.579290   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579338   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:26:20.579415   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.579448   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:26:20.650058   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:20.650150   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:20.668593   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:20.668610   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:20.668676   10408 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0921 15:26:20.679656   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0921 15:26:20.692651   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:20.702013   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0921 15:26:20.715942   10408 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0921 15:26:20.844184   10408 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0921 15:26:20.974988   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:21.117162   10408 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:18.404949   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:18.404961   10389 main.go:134] libmachine: Detecting the provisioner...
	I0921 15:26:18.404967   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.405102   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.405195   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.405274   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.405369   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.405482   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.405601   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.405610   10389 main.go:134] libmachine: About to run SSH command:
	cat /etc/os-release
	I0921 15:26:18.483176   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g1be7c81-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0921 15:26:18.483226   10389 main.go:134] libmachine: found compatible host: buildroot
	I0921 15:26:18.483233   10389 main.go:134] libmachine: Provisioning with buildroot...
	I0921 15:26:18.483245   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.483380   10389 buildroot.go:166] provisioning hostname "false-20220921151637-3535"
	I0921 15:26:18.483392   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.483485   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.483579   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.483675   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.483757   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.483857   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.483983   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.484098   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.484107   10389 main.go:134] libmachine: About to run SSH command:
	sudo hostname false-20220921151637-3535 && echo "false-20220921151637-3535" | sudo tee /etc/hostname
	I0921 15:26:18.570488   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: false-20220921151637-3535
	
	I0921 15:26:18.570510   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.570653   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.570761   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.570862   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.570935   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.571055   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:18.571174   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:18.571186   10389 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfalse-20220921151637-3535' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 false-20220921151637-3535/g' /etc/hosts;
				else 
					echo '127.0.1.1 false-20220921151637-3535' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0921 15:26:18.653580   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0921 15:26:18.653600   10389 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube}
	I0921 15:26:18.653620   10389 buildroot.go:174] setting up certificates
	I0921 15:26:18.653630   10389 provision.go:83] configureAuth start
	I0921 15:26:18.653637   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetMachineName
	I0921 15:26:18.653765   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:18.653853   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.653932   10389 provision.go:138] copyHostCerts
	I0921 15:26:18.654006   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem, removing ...
	I0921 15:26:18.654013   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem
	I0921 15:26:18.654127   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.pem (1078 bytes)
	I0921 15:26:18.654316   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem, removing ...
	I0921 15:26:18.654322   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem
	I0921 15:26:18.654389   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cert.pem (1123 bytes)
	I0921 15:26:18.654553   10389 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem, removing ...
	I0921 15:26:18.654559   10389 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem
	I0921 15:26:18.654614   10389 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/key.pem (1679 bytes)
	I0921 15:26:18.654728   10389 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem org=jenkins.false-20220921151637-3535 san=[192.168.64.30 192.168.64.30 localhost 127.0.0.1 minikube false-20220921151637-3535]
	I0921 15:26:18.931086   10389 provision.go:172] copyRemoteCerts
	I0921 15:26:18.931145   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0921 15:26:18.931162   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:18.931342   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:18.931454   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:18.931547   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:18.931640   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:18.977451   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0921 15:26:18.993393   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0921 15:26:19.009261   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0921 15:26:19.024820   10389 provision.go:86] duration metric: configureAuth took 371.177848ms
	I0921 15:26:19.024832   10389 buildroot.go:189] setting minikube options for container-runtime
	I0921 15:26:19.024951   10389 config.go:180] Loaded profile config "false-20220921151637-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:19.024965   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.025081   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.025169   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.025260   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.025332   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.025427   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.025536   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.025635   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.025643   10389 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0921 15:26:19.103232   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0921 15:26:19.103245   10389 buildroot.go:70] root file system type: tmpfs
	I0921 15:26:19.103367   10389 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0921 15:26:19.103382   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.103506   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.103596   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.103680   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.103774   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.103895   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.103995   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.104045   10389 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0921 15:26:19.189517   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0921 15:26:19.189540   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.189677   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.189768   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.189857   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.189943   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.190071   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.190182   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.190195   10389 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0921 15:26:19.657263   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0921 15:26:19.657285   10389 main.go:134] libmachine: Checking connection to Docker...
	I0921 15:26:19.657293   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetURL
	I0921 15:26:19.657424   10389 main.go:134] libmachine: Docker is up and running!
	I0921 15:26:19.657433   10389 main.go:134] libmachine: Reticulating splines...
	I0921 15:26:19.657441   10389 client.go:171] LocalClient.Create took 10.876166724s
	I0921 15:26:19.657453   10389 start.go:167] duration metric: libmachine.API.Create for "false-20220921151637-3535" took 10.876232302s
	I0921 15:26:19.657465   10389 start.go:300] post-start starting for "false-20220921151637-3535" (driver="hyperkit")
	I0921 15:26:19.657470   10389 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0921 15:26:19.657481   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.657606   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0921 15:26:19.657623   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.657718   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.657815   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.657900   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.657993   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.701002   10389 ssh_runner.go:195] Run: cat /etc/os-release
	I0921 15:26:19.703660   10389 info.go:137] Remote host: Buildroot 2021.02.12
	I0921 15:26:19.703675   10389 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/addons for local assets ...
	I0921 15:26:19.703763   10389 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files for local assets ...
	I0921 15:26:19.703898   10389 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem -> 35352.pem in /etc/ssl/certs
	I0921 15:26:19.704044   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0921 15:26:19.710387   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:19.725495   10389 start.go:303] post-start completed in 68.018939ms
	I0921 15:26:19.725521   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetConfigRaw
	I0921 15:26:19.726077   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:19.726225   10389 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/config.json ...
	I0921 15:26:19.726508   10389 start.go:128] duration metric: createHost completed in 10.995583539s
	I0921 15:26:19.726524   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.726609   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.726688   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.726756   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.726824   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.726940   10389 main.go:134] libmachine: Using SSH client type: native
	I0921 15:26:19.727032   10389 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5c40] 0x13e8dc0 <nil>  [] 0s} 192.168.64.30 22 <nil> <nil>}
	I0921 15:26:19.727039   10389 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0921 15:26:19.803566   10389 main.go:134] libmachine: SSH cmd err, output: <nil>: 1663799179.904471962
	
	I0921 15:26:19.803578   10389 fix.go:207] guest clock: 1663799179.904471962
	I0921 15:26:19.803583   10389 fix.go:220] Guest: 2022-09-21 15:26:19.904471962 -0700 PDT Remote: 2022-09-21 15:26:19.726515 -0700 PDT m=+11.397811697 (delta=177.956962ms)
	I0921 15:26:19.803600   10389 fix.go:191] guest clock delta is within tolerance: 177.956962ms
	I0921 15:26:19.803604   10389 start.go:83] releasing machines lock for "false-20220921151637-3535", held for 11.072844405s
	I0921 15:26:19.803620   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.803781   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetIP
	I0921 15:26:19.803886   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.803980   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804107   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804405   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804511   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:19.804569   10389 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0921 15:26:19.804599   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.804676   10389 ssh_runner.go:195] Run: systemctl --version
	I0921 15:26:19.804691   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:19.804696   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.804788   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.804809   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:19.804910   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.804933   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:19.804984   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.805022   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:19.805139   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:19.847227   10389 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:19.847314   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:19.886987   10389 docker.go:611] Got preloaded images: 
	I0921 15:26:19.887002   10389 docker.go:617] registry.k8s.io/kube-apiserver:v1.25.2 wasn't preloaded
	I0921 15:26:19.887058   10389 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0921 15:26:19.893540   10389 ssh_runner.go:195] Run: which lz4
	I0921 15:26:19.895930   10389 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0921 15:26:19.898413   10389 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0921 15:26:19.898432   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (404136294 bytes)
	I0921 15:26:21.239426   10389 docker.go:576] Took 1.343526 seconds to copy over tarball
	I0921 15:26:21.239490   10389 ssh_runner.go:195] Run: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4
	I0921 15:26:24.582087   10389 ssh_runner.go:235] Completed: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4: (3.342576242s)
	I0921 15:26:24.582101   10389 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0921 15:26:24.608006   10389 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0921 15:26:24.614121   10389 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2628 bytes)
	I0921 15:26:24.625086   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:24.705194   10389 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:25.931663   10389 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.226446575s)
	I0921 15:26:25.931758   10389 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0921 15:26:25.941064   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0921 15:26:25.952201   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:25.960686   10389 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0921 15:26:25.983070   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0921 15:26:25.991760   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0921 15:26:26.004137   10389 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0921 15:26:26.084992   10389 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0921 15:26:26.179551   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:26.278839   10389 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0921 15:26:27.498830   10389 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.219969179s)
	I0921 15:26:27.498903   10389 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0921 15:26:27.582227   10389 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:27.670077   10389 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0921 15:26:27.680350   10389 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0921 15:26:27.680426   10389 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0921 15:26:27.684229   10389 start.go:471] Will wait 60s for crictl version
	I0921 15:26:27.684283   10389 ssh_runner.go:195] Run: sudo crictl version
	I0921 15:26:27.710285   10389 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.18
	RuntimeApiVersion:  1.41.0
	I0921 15:26:27.710350   10389 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:27.730543   10389 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:27.776346   10389 out.go:204] * Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	I0921 15:26:27.776499   10389 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0921 15:26:27.779532   10389 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.64.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0921 15:26:27.786983   10389 localpath.go:92] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/client.crt -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt
	I0921 15:26:27.787207   10389 localpath.go:117] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/client.key -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.key
	I0921 15:26:27.787377   10389 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:27.787423   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:27.803222   10389 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:27.803238   10389 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:27.803305   10389 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:27.818382   10389 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:27.818399   10389 cache_images.go:84] Images are preloaded, skipping loading
	I0921 15:26:27.818461   10389 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0921 15:26:27.839813   10389 cni.go:95] Creating CNI manager for "false"
	I0921 15:26:27.839834   10389 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0921 15:26:27.839848   10389 kubeadm.go:156] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.30 APIServerPort:8443 KubernetesVersion:v1.25.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:false-20220921151637-3535 NodeName:false-20220921151637-3535 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.30"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.30 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.cr
t StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false}
	I0921 15:26:27.839927   10389 kubeadm.go:161] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.30
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "false-20220921151637-3535"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.30
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.30"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0921 15:26:27.839993   10389 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=false-20220921151637-3535 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.30 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.2 ClusterName:false-20220921151637-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:false NodeIP: NodePort:8443 NodeName:}
	I0921 15:26:27.840044   10389 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.2
	I0921 15:26:27.846485   10389 binaries.go:44] Found k8s binaries, skipping transfer
	I0921 15:26:27.846528   10389 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0921 15:26:27.852711   10389 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (488 bytes)
	I0921 15:26:27.863719   10389 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0921 15:26:27.874539   10389 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2050 bytes)
	I0921 15:26:27.885620   10389 ssh_runner.go:195] Run: grep 192.168.64.30	control-plane.minikube.internal$ /etc/hosts
	I0921 15:26:27.887836   10389 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.64.30	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0921 15:26:27.895111   10389 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535 for IP: 192.168.64.30
	I0921 15:26:27.895206   10389 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key
	I0921 15:26:27.895255   10389 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key
	I0921 15:26:27.895337   10389 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.key
	I0921 15:26:27.895361   10389 certs.go:302] generating minikube signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b
	I0921 15:26:27.895377   10389 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b with IP's: [192.168.64.30 10.96.0.1 127.0.0.1 10.0.0.1]
	I0921 15:26:28.090626   10389 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b ...
	I0921 15:26:28.090639   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b: {Name:mkd0021f0880c17472bc34f2bb7b8af87d7a861d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.090958   10389 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b ...
	I0921 15:26:28.090971   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b: {Name:mk0105b4976084bcdc477e16d22340c1f19a3c15 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.091184   10389 certs.go:320] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt.8d1fc39b -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt
	I0921 15:26:28.091356   10389 certs.go:324] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key.8d1fc39b -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key
	I0921 15:26:28.091534   10389 certs.go:302] generating aggregator signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key
	I0921 15:26:28.091547   10389 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt with IP's: []
	I0921 15:26:28.128749   10389 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt ...
	I0921 15:26:28.128759   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt: {Name:mkb235bcbbe39e8b7fc7fa2af71bd625a04514fb Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.129197   10389 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key ...
	I0921 15:26:28.129204   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key: {Name:mkc7b1d50dce94488cf946b55e321c2fd8195b2c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:28.129644   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem (1338 bytes)
	W0921 15:26:28.129681   10389 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535_empty.pem, impossibly tiny 0 bytes
	I0921 15:26:28.129689   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem (1679 bytes)
	I0921 15:26:28.129738   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem (1078 bytes)
	I0921 15:26:28.129767   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem (1123 bytes)
	I0921 15:26:28.129794   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem (1679 bytes)
	I0921 15:26:28.129854   10389 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:28.130421   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0921 15:26:28.147670   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0921 15:26:28.163433   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0921 15:26:28.178707   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0921 15:26:28.193799   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0921 15:26:28.208841   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0921 15:26:28.224170   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0921 15:26:28.239235   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0921 15:26:28.254997   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0921 15:26:28.270476   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem --> /usr/share/ca-certificates/3535.pem (1338 bytes)
	I0921 15:26:28.285761   10389 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /usr/share/ca-certificates/35352.pem (1708 bytes)
	I0921 15:26:28.300863   10389 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0921 15:26:28.311541   10389 ssh_runner.go:195] Run: openssl version
	I0921 15:26:28.314918   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/35352.pem && ln -fs /usr/share/ca-certificates/35352.pem /etc/ssl/certs/35352.pem"
	I0921 15:26:28.322006   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.324825   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep 21 21:31 /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.324854   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/35352.pem
	I0921 15:26:28.328317   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/35352.pem /etc/ssl/certs/3ec20f2e.0"
	I0921 15:26:28.335399   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0921 15:26:28.342321   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.345213   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep 21 21:27 /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.345248   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:28.348680   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0921 15:26:28.355668   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3535.pem && ln -fs /usr/share/ca-certificates/3535.pem /etc/ssl/certs/3535.pem"
	I0921 15:26:28.362704   10389 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.365564   10389 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep 21 21:31 /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.365597   10389 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3535.pem
	I0921 15:26:28.369054   10389 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3535.pem /etc/ssl/certs/51391683.0"
	I0921 15:26:28.375971   10389 kubeadm.go:396] StartCluster: {Name:false-20220921151637-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:false-20220921151637
-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:false NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.30 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p M
ountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:28.393673   10389 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:26:28.410852   10389 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0921 15:26:28.417363   10389 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0921 15:26:28.423501   10389 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0921 15:26:28.429757   10389 kubeadm.go:152] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0921 15:26:28.429778   10389 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,Mem"
	I0921 15:26:28.485563   10389 kubeadm.go:317] [init] Using Kubernetes version: v1.25.2
	I0921 15:26:28.485628   10389 kubeadm.go:317] [preflight] Running pre-flight checks
	I0921 15:26:28.613102   10389 kubeadm.go:317] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0921 15:26:28.613192   10389 kubeadm.go:317] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0921 15:26:28.613262   10389 kubeadm.go:317] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0921 15:26:28.713134   10389 kubeadm.go:317] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0921 15:26:29.173173   10408 ssh_runner.go:235] Completed: sudo systemctl restart docker: (8.055980768s)
	I0921 15:26:29.173240   10408 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0921 15:26:29.288535   10408 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0921 15:26:29.417731   10408 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0921 15:26:29.433270   10408 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0921 15:26:29.433356   10408 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0921 15:26:29.447293   10408 start.go:471] Will wait 60s for crictl version
	I0921 15:26:29.447353   10408 ssh_runner.go:195] Run: sudo crictl version
	I0921 15:26:29.482799   10408 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.18
	RuntimeApiVersion:  1.41.0
	I0921 15:26:29.482858   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.651357   10408 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0921 15:26:29.808439   10408 out.go:204] * Preparing Kubernetes v1.25.2 on Docker 20.10.18 ...
	I0921 15:26:29.808534   10408 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0921 15:26:29.818111   10408 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 15:26:29.818177   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.873620   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.873633   10408 docker.go:542] Images already preloaded, skipping extraction
	I0921 15:26:29.873699   10408 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0921 15:26:29.929931   10408 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.2
	registry.k8s.io/kube-scheduler:v1.25.2
	registry.k8s.io/kube-controller-manager:v1.25.2
	registry.k8s.io/kube-proxy:v1.25.2
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0921 15:26:29.929952   10408 cache_images.go:84] Images are preloaded, skipping loading
	I0921 15:26:29.930056   10408 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0921 15:26:30.064287   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:26:30.064305   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:26:30.064320   10408 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0921 15:26:30.064331   10408 kubeadm.go:156] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.28 APIServerPort:8443 KubernetesVersion:v1.25.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-20220921152522-3535 NodeName:pause-20220921152522-3535 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.28"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.64.28 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false}
	I0921 15:26:30.064423   10408 kubeadm.go:161] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.28
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "pause-20220921152522-3535"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.28
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.28"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	nodefs.available: "0%"
	nodefs.inodesFree: "0%"
	imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0921 15:26:30.064505   10408 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=pause-20220921152522-3535 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.28 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I0921 15:26:30.064579   10408 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.2
	I0921 15:26:30.076550   10408 binaries.go:44] Found k8s binaries, skipping transfer
	I0921 15:26:30.076638   10408 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0921 15:26:30.090012   10408 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (488 bytes)
	I0921 15:26:30.137803   10408 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0921 15:26:30.178146   10408 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2050 bytes)
	I0921 15:26:30.203255   10408 ssh_runner.go:195] Run: grep 192.168.64.28	control-plane.minikube.internal$ /etc/hosts
	I0921 15:26:30.209779   10408 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535 for IP: 192.168.64.28
	I0921 15:26:30.209879   10408 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key
	I0921 15:26:30.209934   10408 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key
	I0921 15:26:30.210019   10408 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key
	I0921 15:26:30.210082   10408 certs.go:298] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key.6733b561
	I0921 15:26:30.210133   10408 certs.go:298] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key
	I0921 15:26:30.210333   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem (1338 bytes)
	W0921 15:26:30.210375   10408 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535_empty.pem, impossibly tiny 0 bytes
	I0921 15:26:30.210388   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca-key.pem (1679 bytes)
	I0921 15:26:30.210421   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/ca.pem (1078 bytes)
	I0921 15:26:30.210453   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/cert.pem (1123 bytes)
	I0921 15:26:30.210483   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/key.pem (1679 bytes)
	I0921 15:26:30.210550   10408 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem (1708 bytes)
	I0921 15:26:30.211086   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0921 15:26:30.279069   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0921 15:26:30.343250   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0921 15:26:30.413180   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0921 15:26:30.448798   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0921 15:26:30.476175   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0921 15:26:30.497204   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0921 15:26:30.524103   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0921 15:26:30.558966   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/ssl/certs/35352.pem --> /usr/share/ca-certificates/35352.pem (1708 bytes)
	I0921 15:26:30.576319   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0921 15:26:30.592912   10408 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/certs/3535.pem --> /usr/share/ca-certificates/3535.pem (1338 bytes)
	I0921 15:26:30.609099   10408 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0921 15:26:30.627179   10408 ssh_runner.go:195] Run: openssl version
	I0921 15:26:30.632801   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/3535.pem && ln -fs /usr/share/ca-certificates/3535.pem /etc/ssl/certs/3535.pem"
	I0921 15:26:30.641473   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645794   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep 21 21:31 /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.645836   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/3535.pem
	I0921 15:26:30.649794   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/3535.pem /etc/ssl/certs/51391683.0"
	I0921 15:26:30.657630   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/35352.pem && ln -fs /usr/share/ca-certificates/35352.pem /etc/ssl/certs/35352.pem"
	I0921 15:26:30.665747   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669804   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep 21 21:31 /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.669850   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/35352.pem
	I0921 15:26:30.679638   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/35352.pem /etc/ssl/certs/3ec20f2e.0"
	I0921 15:26:30.700907   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0921 15:26:30.734369   10408 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762750   10408 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep 21 21:27 /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.762827   10408 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0921 15:26:30.777627   10408 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0921 15:26:30.785856   10408 kubeadm.go:396] StartCluster: {Name:pause-20220921152522-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:pause-20220921152522-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 15:26:30.785963   10408 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:26:30.816264   10408 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0921 15:26:30.823179   10408 kubeadm.go:411] found existing configuration files, will attempt cluster restart
	I0921 15:26:30.823195   10408 kubeadm.go:627] restartCluster start
	I0921 15:26:30.823236   10408 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0921 15:26:30.837045   10408 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:26:30.837457   10408 kubeconfig.go:92] found "pause-20220921152522-3535" server: "https://192.168.64.28:8443"
	I0921 15:26:30.837839   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:26:30.838375   10408 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0921 15:26:30.852535   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:30.852588   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:30.868059   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:30.876185   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:30.876238   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:30.912452   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:30.912472   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:28.751035   10389 out.go:204]   - Generating certificates and keys ...
	I0921 15:26:28.751152   10389 kubeadm.go:317] [certs] Using existing ca certificate authority
	I0921 15:26:28.751236   10389 kubeadm.go:317] [certs] Using existing apiserver certificate and key on disk
	I0921 15:26:28.782482   10389 kubeadm.go:317] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0921 15:26:29.137189   10389 kubeadm.go:317] [certs] Generating "front-proxy-ca" certificate and key
	I0921 15:26:29.241745   10389 kubeadm.go:317] [certs] Generating "front-proxy-client" certificate and key
	I0921 15:26:29.350166   10389 kubeadm.go:317] [certs] Generating "etcd/ca" certificate and key
	I0921 15:26:29.505698   10389 kubeadm.go:317] [certs] Generating "etcd/server" certificate and key
	I0921 15:26:29.505932   10389 kubeadm.go:317] [certs] etcd/server serving cert is signed for DNS names [false-20220921151637-3535 localhost] and IPs [192.168.64.30 127.0.0.1 ::1]
	I0921 15:26:29.604706   10389 kubeadm.go:317] [certs] Generating "etcd/peer" certificate and key
	I0921 15:26:29.604909   10389 kubeadm.go:317] [certs] etcd/peer serving cert is signed for DNS names [false-20220921151637-3535 localhost] and IPs [192.168.64.30 127.0.0.1 ::1]
	I0921 15:26:29.834088   10389 kubeadm.go:317] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0921 15:26:29.943628   10389 kubeadm.go:317] [certs] Generating "apiserver-etcd-client" certificate and key
	I0921 15:26:30.177452   10389 kubeadm.go:317] [certs] Generating "sa" key and public key
	I0921 15:26:30.177562   10389 kubeadm.go:317] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0921 15:26:30.679764   10389 kubeadm.go:317] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0921 15:26:30.762950   10389 kubeadm.go:317] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0921 15:26:30.975611   10389 kubeadm.go:317] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0921 15:26:31.368343   10389 kubeadm.go:317] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0921 15:26:31.380985   10389 kubeadm.go:317] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0921 15:26:31.381763   10389 kubeadm.go:317] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0921 15:26:31.381810   10389 kubeadm.go:317] [kubelet-start] Starting the kubelet
	I0921 15:26:31.468060   10389 kubeadm.go:317] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0921 15:26:31.487973   10389 out.go:204]   - Booting up control plane ...
	I0921 15:26:31.488058   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0921 15:26:31.488140   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0921 15:26:31.488216   10389 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0921 15:26:31.488288   10389 kubeadm.go:317] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0921 15:26:31.488408   10389 kubeadm.go:317] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests". This can take up to 4m0s
	I0921 15:26:35.914013   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:35.914061   10408 retry.go:31] will retry after 263.082536ms: state is "Stopped"
	I0921 15:26:36.179260   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:41.180983   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:41.181007   10408 retry.go:31] will retry after 381.329545ms: state is "Stopped"
	I0921 15:26:43.469751   10389 kubeadm.go:317] [apiclient] All control plane components are healthy after 12.003918 seconds
	I0921 15:26:43.469852   10389 kubeadm.go:317] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0921 15:26:43.477591   10389 kubeadm.go:317] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0921 15:26:44.989240   10389 kubeadm.go:317] [upload-certs] Skipping phase. Please see --upload-certs
	I0921 15:26:44.989436   10389 kubeadm.go:317] [mark-control-plane] Marking the node false-20220921151637-3535 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0921 15:26:45.496387   10389 kubeadm.go:317] [bootstrap-token] Using token: gw23ty.315hs4knjisv0ijr
	I0921 15:26:41.563913   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:45.534959   10389 out.go:204]   - Configuring RBAC rules ...
	I0921 15:26:45.535164   10389 kubeadm.go:317] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0921 15:26:45.535348   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0921 15:26:45.575312   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0921 15:26:45.577832   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0921 15:26:45.580659   10389 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0921 15:26:45.582707   10389 kubeadm.go:317] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0921 15:26:45.589329   10389 kubeadm.go:317] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0921 15:26:45.765645   10389 kubeadm.go:317] [addons] Applied essential addon: CoreDNS
	I0921 15:26:45.903347   10389 kubeadm.go:317] [addons] Applied essential addon: kube-proxy
	I0921 15:26:45.903987   10389 kubeadm.go:317] 
	I0921 15:26:45.904052   10389 kubeadm.go:317] Your Kubernetes control-plane has initialized successfully!
	I0921 15:26:45.904063   10389 kubeadm.go:317] 
	I0921 15:26:45.904125   10389 kubeadm.go:317] To start using your cluster, you need to run the following as a regular user:
	I0921 15:26:45.904133   10389 kubeadm.go:317] 
	I0921 15:26:45.904151   10389 kubeadm.go:317]   mkdir -p $HOME/.kube
	I0921 15:26:45.904270   10389 kubeadm.go:317]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0921 15:26:45.904382   10389 kubeadm.go:317]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0921 15:26:45.904399   10389 kubeadm.go:317] 
	I0921 15:26:45.904507   10389 kubeadm.go:317] Alternatively, if you are the root user, you can run:
	I0921 15:26:45.904518   10389 kubeadm.go:317] 
	I0921 15:26:45.904599   10389 kubeadm.go:317]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0921 15:26:45.904608   10389 kubeadm.go:317] 
	I0921 15:26:45.904652   10389 kubeadm.go:317] You should now deploy a pod network to the cluster.
	I0921 15:26:45.904743   10389 kubeadm.go:317] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0921 15:26:45.904821   10389 kubeadm.go:317]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0921 15:26:45.904853   10389 kubeadm.go:317] 
	I0921 15:26:45.904929   10389 kubeadm.go:317] You can now join any number of control-plane nodes by copying certificate authorities
	I0921 15:26:45.905009   10389 kubeadm.go:317] and service account keys on each node and then running the following as root:
	I0921 15:26:45.905013   10389 kubeadm.go:317] 
	I0921 15:26:45.905081   10389 kubeadm.go:317]   kubeadm join control-plane.minikube.internal:8443 --token gw23ty.315hs4knjisv0ijr \
	I0921 15:26:45.905165   10389 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:706daf9048108456ab2312c550f8f0627aeca112971c3da5a874015a0cee155c \
	I0921 15:26:45.905182   10389 kubeadm.go:317] 	--control-plane 
	I0921 15:26:45.905187   10389 kubeadm.go:317] 
	I0921 15:26:45.905254   10389 kubeadm.go:317] Then you can join any number of worker nodes by running the following on each as root:
	I0921 15:26:45.905261   10389 kubeadm.go:317] 
	I0921 15:26:45.905329   10389 kubeadm.go:317] kubeadm join control-plane.minikube.internal:8443 --token gw23ty.315hs4knjisv0ijr \
	I0921 15:26:45.905405   10389 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:706daf9048108456ab2312c550f8f0627aeca112971c3da5a874015a0cee155c 
	I0921 15:26:45.906103   10389 kubeadm.go:317] W0921 22:26:28.588830    1256 initconfiguration.go:119] Usage of CRI endpoints without URL scheme is deprecated and can cause kubelet errors in the future. Automatically prepending scheme "unix" to the "criSocket" with value "/var/run/cri-dockerd.sock". Please update your configuration!
	I0921 15:26:45.906192   10389 kubeadm.go:317] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0921 15:26:45.906207   10389 cni.go:95] Creating CNI manager for "false"
	I0921 15:26:45.906225   10389 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0921 15:26:45.906290   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:45.906301   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl label nodes minikube.k8s.io/version=v1.27.0 minikube.k8s.io/commit=937c68716dfaac5b5ffa3b6655158d5d3472b8c4 minikube.k8s.io/name=false-20220921151637-3535 minikube.k8s.io/updated_at=2022_09_21T15_26_45_0700 minikube.k8s.io/primary=true --all --overwrite --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.087744   10389 ops.go:34] apiserver oom_adj: -16
	I0921 15:26:46.087768   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.661358   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:47.163233   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:47.661991   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:48.162015   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:46.564586   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:26:46.766257   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:26:46.766358   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:26:46.776615   10408 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4520/cgroup
	I0921 15:26:46.782756   10408 api_server.go:181] apiserver freezer: "2:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope"
	I0921 15:26:46.782801   10408 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22aaa89e8234f176d6344e50152f4.slice/docker-3a4741e1fe3c0996cab4975bd514e9991794f86cf96c9fe0863c714a6d86e26c.scope/freezer.state
	I0921 15:26:46.789298   10408 api_server.go:203] freezer state: "THAWED"
	I0921 15:26:46.789309   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.288815   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": read tcp 192.168.64.1:52998->192.168.64.28:8443: read: connection reset by peer
	I0921 15:26:51.288848   10408 retry.go:31] will retry after 242.214273ms: state is "Stopped"
	I0921 15:26:48.662979   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:49.163023   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:49.662057   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:50.162176   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:50.663300   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.162051   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.661237   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:52.161318   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:52.663231   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:53.162177   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:51.532207   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:51.632400   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:51.632425   10408 retry.go:31] will retry after 300.724609ms: state is "Stopped"
	I0921 15:26:51.934415   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.035144   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.035176   10408 retry.go:31] will retry after 427.113882ms: state is "Stopped"
	I0921 15:26:52.464328   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:52.566391   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:52.566426   10408 retry.go:31] will retry after 382.2356ms: state is "Stopped"
	I0921 15:26:52.948987   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.049570   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.049605   10408 retry.go:31] will retry after 505.529557ms: state is "Stopped"
	I0921 15:26:53.556334   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:53.658245   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:53.658268   10408 retry.go:31] will retry after 609.195524ms: state is "Stopped"
	I0921 15:26:54.269593   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:54.371296   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:54.371340   10408 retry.go:31] will retry after 858.741692ms: state is "Stopped"
	I0921 15:26:55.230116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:55.331214   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:55.331251   10408 retry.go:31] will retry after 1.201160326s: state is "Stopped"
	I0921 15:26:53.661186   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:54.163293   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:54.661188   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:55.161203   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:55.661768   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:56.161278   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:56.661209   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:57.161293   10389 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0921 15:26:57.227024   10389 kubeadm.go:1067] duration metric: took 11.320770189s to wait for elevateKubeSystemPrivileges.
	I0921 15:26:57.227047   10389 kubeadm.go:398] StartCluster complete in 28.851048117s
	I0921 15:26:57.227062   10389 settings.go:142] acquiring lock: {Name:mkb00f1de0b91d8f67bd982eab088d27845674b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:57.227132   10389 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:26:57.227768   10389 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig: {Name:mka2f83e1cbd4124ff7179732fbb172d977cf2f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:26:57.740783   10389 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "false-20220921151637-3535" rescaled to 1
	I0921 15:26:57.740812   10389 start.go:211] Will wait 5m0s for node &{Name: IP:192.168.64.30 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0921 15:26:57.740821   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0921 15:26:57.740854   10389 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0921 15:26:57.740962   10389 config.go:180] Loaded profile config "false-20220921151637-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:26:57.786566   10389 addons.go:65] Setting storage-provisioner=true in profile "false-20220921151637-3535"
	I0921 15:26:57.786585   10389 addons.go:153] Setting addon storage-provisioner=true in "false-20220921151637-3535"
	I0921 15:26:57.786585   10389 addons.go:65] Setting default-storageclass=true in profile "false-20220921151637-3535"
	I0921 15:26:57.786492   10389 out.go:177] * Verifying Kubernetes components...
	W0921 15:26:57.786593   10389 addons.go:162] addon storage-provisioner should already be in state true
	I0921 15:26:57.786605   10389 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "false-20220921151637-3535"
	I0921 15:26:57.786637   10389 host.go:66] Checking if "false-20220921151637-3535" exists ...
	I0921 15:26:57.823578   10389 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:26:57.824055   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.824059   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.824098   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.824128   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.831913   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53008
	I0921 15:26:57.831981   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53009
	I0921 15:26:57.832340   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.832352   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.832684   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.832694   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.832700   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.832713   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.832896   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.832944   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.832993   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.833084   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.833170   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.833345   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.833360   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.839848   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53012
	I0921 15:26:57.840218   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.840571   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.840590   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.840793   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.840888   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.840964   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.841057   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.841584   10389 addons.go:153] Setting addon default-storageclass=true in "false-20220921151637-3535"
	W0921 15:26:57.841596   10389 addons.go:162] addon default-storageclass should already be in state true
	I0921 15:26:57.841612   10389 host.go:66] Checking if "false-20220921151637-3535" exists ...
	I0921 15:26:57.841859   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.841874   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.841903   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:57.848370   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53014
	I0921 15:26:57.879837   10389 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0921 15:26:57.853392   10389 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0921 15:26:57.856801   10389 node_ready.go:35] waiting up to 5m0s for node "false-20220921151637-3535" to be "Ready" ...
	I0921 15:26:57.880708   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.901652   10389 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:26:57.901674   10389 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0921 15:26:57.901717   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:57.902040   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:57.902220   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.902228   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:57.902244   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.902481   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:57.902678   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.902711   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:57.903323   10389 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:26:57.903348   10389 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:26:57.907923   10389 node_ready.go:49] node "false-20220921151637-3535" has status "Ready":"True"
	I0921 15:26:57.907937   10389 node_ready.go:38] duration metric: took 6.436476ms waiting for node "false-20220921151637-3535" to be "Ready" ...
	I0921 15:26:57.907943   10389 pod_ready.go:35] extra waiting up to 5m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:26:57.910202   10389 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53017
	I0921 15:26:57.910546   10389 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:26:57.910873   10389 main.go:134] libmachine: Using API Version  1
	I0921 15:26:57.910889   10389 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:26:57.911076   10389 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:26:57.911170   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetState
	I0921 15:26:57.911256   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:26:57.911338   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | hyperkit pid from json: 10400
	I0921 15:26:57.912159   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .DriverName
	I0921 15:26:57.912315   10389 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0921 15:26:57.912323   10389 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0921 15:26:57.912331   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHHostname
	I0921 15:26:57.912418   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHPort
	I0921 15:26:57.912497   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHKeyPath
	I0921 15:26:57.912584   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .GetSSHUsername
	I0921 15:26:57.912659   10389 sshutil.go:53] new ssh client: &{IP:192.168.64.30 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/false-20220921151637-3535/id_rsa Username:docker}
	I0921 15:26:57.919652   10389 pod_ready.go:78] waiting up to 5m0s for pod "coredns-565d847f94-pns2v" in "kube-system" namespace to be "Ready" ...
	I0921 15:26:58.008677   10389 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0921 15:26:58.015955   10389 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:26:59.137018   10389 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (1.235523727s)
	I0921 15:26:59.137048   10389 start.go:810] {"host.minikube.internal": 192.168.64.1} host record injected into CoreDNS
	I0921 15:26:59.214166   10389 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.198193011s)
	I0921 15:26:59.214197   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214212   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214261   10389 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (1.205563718s)
	I0921 15:26:59.214276   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214283   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214398   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.214419   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.214438   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214449   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214452   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214458   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214464   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214465   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214473   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214483   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214582   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214593   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214605   10389 main.go:134] libmachine: Making call to close driver server
	I0921 15:26:59.214615   10389 main.go:134] libmachine: (false-20220921151637-3535) Calling .Close
	I0921 15:26:59.214655   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214663   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214784   10389 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:26:59.214810   10389 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:26:59.214847   10389 main.go:134] libmachine: (false-20220921151637-3535) DBG | Closing plugin on server side
	I0921 15:26:59.257530   10389 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0921 15:26:56.533116   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:56.635643   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:56.635670   10408 retry.go:31] will retry after 1.723796097s: state is "Stopped"
	I0921 15:26:58.359704   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:26:58.461478   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:26:58.461505   10408 retry.go:31] will retry after 1.596532639s: state is "Stopped"
	I0921 15:27:00.059136   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:00.159945   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": dial tcp 192.168.64.28:8443: connect: connection refused
	I0921 15:27:00.159971   10408 api_server.go:165] Checking apiserver status ...
	I0921 15:27:00.160018   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0921 15:27:00.169632   10408 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:00.169647   10408 kubeadm.go:602] needs reconfigure: apiserver error: timed out waiting for the condition
	I0921 15:27:00.169656   10408 kubeadm.go:1114] stopping kube-system containers ...
	I0921 15:27:00.169722   10408 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0921 15:27:00.201882   10408 docker.go:443] Stopping containers: [d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49]
	I0921 15:27:00.201952   10408 ssh_runner.go:195] Run: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49
	I0921 15:26:59.279382   10389 addons.go:414] enableAddons completed in 1.538525769s
	I0921 15:26:59.940505   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:02.438511   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:05.344188   10408 ssh_runner.go:235] Completed: docker stop d7cbc4c453b0 823942ffecb6 283fac289f86 c2e8fe8419a9 4934b6e15931 3a4741e1fe3c e1129956136e 3d0143698c2d 163c82f50ebf 994dd806c8bf eb1318ed7bcc 1a3e01fca571 5fc70456f2e3 54e273754edc 52c58a26f4cc 4ad5f51c22d6 3ac721feff71 bf1833cd9ccb 532325020c06 7d83f8f7d4ba b943e6acece0 25c3a0228e49: (5.142213633s)
	I0921 15:27:05.344244   10408 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0921 15:27:05.419551   10408 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0921 15:27:05.433375   10408 kubeadm.go:155] found existing configuration files:
	-rw------- 1 root root 5643 Sep 21 22:25 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5657 Sep 21 22:25 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Sep 21 22:25 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5601 Sep 21 22:25 /etc/kubernetes/scheduler.conf
	
	I0921 15:27:05.433432   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0921 15:27:05.439704   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0921 15:27:05.445874   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.453215   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.453270   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0921 15:27:05.459417   10408 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0921 15:27:05.465309   10408 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0921 15:27:05.465358   10408 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0921 15:27:05.476008   10408 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484410   10408 kubeadm.go:704] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I0921 15:27:05.484426   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:05.534434   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:04.440960   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:06.941172   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:06.469884   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.628867   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.698897   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:06.759299   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:06.759353   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:06.778540   10408 api_server.go:71] duration metric: took 19.241402ms to wait for apiserver process to appear ...
	I0921 15:27:06.778552   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:06.778559   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:09.441803   10389 pod_ready.go:102] pod "coredns-565d847f94-pns2v" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:09.938218   10389 pod_ready.go:97] error getting pod "coredns-565d847f94-pns2v" in "kube-system" namespace (skipping!): pods "coredns-565d847f94-pns2v" not found
	I0921 15:27:09.938237   10389 pod_ready.go:81] duration metric: took 12.018553938s waiting for pod "coredns-565d847f94-pns2v" in "kube-system" namespace to be "Ready" ...
	E0921 15:27:09.938247   10389 pod_ready.go:66] WaitExtra: waitPodCondition: error getting pod "coredns-565d847f94-pns2v" in "kube-system" namespace (skipping!): pods "coredns-565d847f94-pns2v" not found
	I0921 15:27:09.938253   10389 pod_ready.go:78] waiting up to 5m0s for pod "coredns-565d847f94-wwhtk" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:11.950940   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:11.780440   10408 api_server.go:256] stopped: https://192.168.64.28:8443/healthz: Get "https://192.168.64.28:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0921 15:27:12.280518   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.000183   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0921 15:27:14.000198   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0921 15:27:14.282668   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.289281   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.289293   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:14.780762   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:14.786529   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0921 15:27:14.786540   10408 api_server.go:102] status: https://192.168.64.28:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0921 15:27:15.280930   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:15.288106   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:15.292969   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:15.292981   10408 api_server.go:130] duration metric: took 8.514415313s to wait for apiserver health ...
	I0921 15:27:15.292986   10408 cni.go:95] Creating CNI manager for ""
	I0921 15:27:15.292994   10408 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 15:27:15.293004   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:15.298309   10408 system_pods.go:59] 6 kube-system pods found
	I0921 15:27:15.298324   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:15.298330   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0921 15:27:15.298335   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0921 15:27:15.298340   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0921 15:27:15.298344   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:15.298348   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0921 15:27:15.298352   10408 system_pods.go:74] duration metric: took 5.344262ms to wait for pod list to return data ...
	I0921 15:27:15.298357   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:15.300304   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:15.300319   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:15.300328   10408 node_conditions.go:105] duration metric: took 1.967816ms to run NodePressure ...
	I0921 15:27:15.300342   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.2:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0921 15:27:15.402185   10408 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405062   10408 kubeadm.go:778] kubelet initialised
	I0921 15:27:15.405072   10408 kubeadm.go:779] duration metric: took 2.873657ms waiting for restarted kubelet to initialise ...
	I0921 15:27:15.405080   10408 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:15.408132   10408 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411452   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:15.411459   10408 pod_ready.go:81] duration metric: took 3.317632ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:15.411465   10408 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:14.445892   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:16.945831   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:17.420289   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:19.421503   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:18.946719   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:20.947256   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:22.950309   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:21.919889   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:24.419226   10408 pod_ready.go:102] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:25.920028   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.920043   10408 pod_ready.go:81] duration metric: took 10.508561161s waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.920049   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923063   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.923071   10408 pod_ready.go:81] duration metric: took 3.017613ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.923077   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926284   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.926292   10408 pod_ready.go:81] duration metric: took 3.20987ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.926297   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929448   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.929456   10408 pod_ready.go:81] duration metric: took 3.154194ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.929461   10408 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932599   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:25.932606   10408 pod_ready.go:81] duration metric: took 3.140486ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:25.932610   10408 pod_ready.go:38] duration metric: took 10.527510396s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:25.932619   10408 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0921 15:27:25.939997   10408 ops.go:34] apiserver oom_adj: -16
	I0921 15:27:25.940008   10408 kubeadm.go:631] restartCluster took 55.116747244s
	I0921 15:27:25.940013   10408 kubeadm.go:398] StartCluster complete in 55.154103553s
	I0921 15:27:25.940027   10408 settings.go:142] acquiring lock: {Name:mkb00f1de0b91d8f67bd982eab088d27845674b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.940102   10408 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 15:27:25.941204   10408 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig: {Name:mka2f83e1cbd4124ff7179732fbb172d977cf2f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0921 15:27:25.942042   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:25.944188   10408 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-20220921152522-3535" rescaled to 1
	I0921 15:27:25.944221   10408 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.28 Port:8443 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0921 15:27:25.944255   10408 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0921 15:27:25.944277   10408 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0921 15:27:25.944378   10408 config.go:180] Loaded profile config "pause-20220921152522-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:27:25.967437   10408 addons.go:65] Setting storage-provisioner=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967440   10408 addons.go:65] Setting default-storageclass=true in profile "pause-20220921152522-3535"
	I0921 15:27:25.967359   10408 out.go:177] * Verifying Kubernetes components...
	I0921 15:27:25.967453   10408 addons.go:153] Setting addon storage-provisioner=true in "pause-20220921152522-3535"
	I0921 15:27:25.967457   10408 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-20220921152522-3535"
	W0921 15:27:25.967460   10408 addons.go:162] addon storage-provisioner should already be in state true
	I0921 15:27:26.012377   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:26.012436   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.012762   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012761   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.012794   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.012829   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.019897   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53028
	I0921 15:27:26.020028   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53029
	I0921 15:27:26.020328   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020394   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.020706   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020719   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020801   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.020817   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.020929   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021015   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.021115   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.021203   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.021283   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.021419   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.021443   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.023750   10408 kapi.go:59] client config for pause-20220921152522-3535: &rest.Config{Host:"https://192.168.64.28:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/pause-20220921152522-3535/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x233b400), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0921 15:27:26.027574   10408 addons.go:153] Setting addon default-storageclass=true in "pause-20220921152522-3535"
	W0921 15:27:26.027587   10408 addons.go:162] addon default-storageclass should already be in state true
	I0921 15:27:26.027606   10408 host.go:66] Checking if "pause-20220921152522-3535" exists ...
	I0921 15:27:26.027788   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53032
	I0921 15:27:26.027854   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.027880   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.028560   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.029753   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.029767   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.030003   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.030113   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.030207   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.030282   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.031135   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.034331   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53034
	I0921 15:27:26.055199   10408 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0921 15:27:26.038435   10408 node_ready.go:35] waiting up to 6m0s for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.038466   10408 start.go:790] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0921 15:27:26.055642   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.075151   10408 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.075161   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0921 15:27:26.075184   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.075306   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.075441   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.075451   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.075455   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.075546   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.075643   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.075669   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.076075   10408 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:27:26.076097   10408 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:27:26.082485   10408 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:53037
	I0921 15:27:26.082858   10408 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:27:26.083217   10408 main.go:134] libmachine: Using API Version  1
	I0921 15:27:26.083234   10408 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:27:26.083443   10408 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:27:26.083534   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetState
	I0921 15:27:26.083608   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:27:26.083699   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | hyperkit pid from json: 10295
	I0921 15:27:26.084503   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .DriverName
	I0921 15:27:26.084648   10408 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.084657   10408 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0921 15:27:26.084665   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHHostname
	I0921 15:27:26.084734   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHPort
	I0921 15:27:26.084830   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHKeyPath
	I0921 15:27:26.084916   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .GetSSHUsername
	I0921 15:27:26.085010   10408 sshutil.go:53] new ssh client: &{IP:192.168.64.28 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/pause-20220921152522-3535/id_rsa Username:docker}
	I0921 15:27:26.117393   10408 node_ready.go:49] node "pause-20220921152522-3535" has status "Ready":"True"
	I0921 15:27:26.117403   10408 node_ready.go:38] duration metric: took 42.373374ms waiting for node "pause-20220921152522-3535" to be "Ready" ...
	I0921 15:27:26.117410   10408 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:26.127239   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0921 15:27:26.137634   10408 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0921 15:27:26.319821   10408 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.697611   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697627   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697784   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697793   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697804   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.697809   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.697836   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.697938   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.697946   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.697962   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712622   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712636   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712825   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712834   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712839   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712844   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712846   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.712954   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.712962   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.712969   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.712973   10408 main.go:134] libmachine: Making call to close driver server
	I0921 15:27:26.712981   10408 main.go:134] libmachine: (pause-20220921152522-3535) Calling .Close
	I0921 15:27:26.713114   10408 main.go:134] libmachine: Successfully made call to close driver server
	I0921 15:27:26.713128   10408 main.go:134] libmachine: Making call to close connection to plugin binary
	I0921 15:27:26.713142   10408 main.go:134] libmachine: (pause-20220921152522-3535) DBG | Closing plugin on server side
	I0921 15:27:26.735926   10408 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0921 15:27:25.446939   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:27.947781   10389 pod_ready.go:102] pod "coredns-565d847f94-wwhtk" in "kube-system" namespace has status "Ready":"False"
	I0921 15:27:26.773142   10408 addons.go:414] enableAddons completed in 828.831417ms
	I0921 15:27:26.776027   10408 pod_ready.go:92] pod "coredns-565d847f94-9wtnp" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:26.776040   10408 pod_ready.go:81] duration metric: took 456.205251ms waiting for pod "coredns-565d847f94-9wtnp" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:26.776049   10408 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117622   10408 pod_ready.go:92] pod "etcd-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.117632   10408 pod_ready.go:81] duration metric: took 341.577773ms waiting for pod "etcd-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.117638   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518637   10408 pod_ready.go:92] pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.518650   10408 pod_ready.go:81] duration metric: took 401.006674ms waiting for pod "kube-apiserver-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.518660   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918763   10408 pod_ready.go:92] pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:27.918778   10408 pod_ready.go:81] duration metric: took 400.10892ms waiting for pod "kube-controller-manager-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:27.918787   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318657   10408 pod_ready.go:92] pod "kube-proxy-5c7jc" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.318670   10408 pod_ready.go:81] duration metric: took 399.877205ms waiting for pod "kube-proxy-5c7jc" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.318678   10408 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720230   10408 pod_ready.go:92] pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace has status "Ready":"True"
	I0921 15:27:28.720243   10408 pod_ready.go:81] duration metric: took 401.55845ms waiting for pod "kube-scheduler-pause-20220921152522-3535" in "kube-system" namespace to be "Ready" ...
	I0921 15:27:28.720250   10408 pod_ready.go:38] duration metric: took 2.602830576s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0921 15:27:28.720263   10408 api_server.go:51] waiting for apiserver process to appear ...
	I0921 15:27:28.720316   10408 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 15:27:28.729887   10408 api_server.go:71] duration metric: took 2.78564504s to wait for apiserver process to appear ...
	I0921 15:27:28.729899   10408 api_server.go:87] waiting for apiserver healthz status ...
	I0921 15:27:28.729905   10408 api_server.go:240] Checking apiserver healthz at https://192.168.64.28:8443/healthz ...
	I0921 15:27:28.733744   10408 api_server.go:266] https://192.168.64.28:8443/healthz returned 200:
	ok
	I0921 15:27:28.734313   10408 api_server.go:140] control plane version: v1.25.2
	I0921 15:27:28.734323   10408 api_server.go:130] duration metric: took 4.419338ms to wait for apiserver health ...
	I0921 15:27:28.734328   10408 system_pods.go:43] waiting for kube-system pods to appear ...
	I0921 15:27:28.920241   10408 system_pods.go:59] 7 kube-system pods found
	I0921 15:27:28.920257   10408 system_pods.go:61] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:28.920261   10408 system_pods.go:61] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:28.920274   10408 system_pods.go:61] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:28.920279   10408 system_pods.go:61] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:28.920283   10408 system_pods.go:61] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:28.920286   10408 system_pods.go:61] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:28.920289   10408 system_pods.go:61] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:28.920294   10408 system_pods.go:74] duration metric: took 185.961163ms to wait for pod list to return data ...
	I0921 15:27:28.920300   10408 default_sa.go:34] waiting for default service account to be created ...
	I0921 15:27:29.119704   10408 default_sa.go:45] found service account: "default"
	I0921 15:27:29.119720   10408 default_sa.go:55] duration metric: took 199.41576ms for default service account to be created ...
	I0921 15:27:29.119727   10408 system_pods.go:116] waiting for k8s-apps to be running ...
	I0921 15:27:29.322362   10408 system_pods.go:86] 7 kube-system pods found
	I0921 15:27:29.322375   10408 system_pods.go:89] "coredns-565d847f94-9wtnp" [eb8f3bae-6107-4a2b-ba32-d79405830bf0] Running
	I0921 15:27:29.322379   10408 system_pods.go:89] "etcd-pause-20220921152522-3535" [17c2d77b-b921-47a8-9a13-17620d5b88c8] Running
	I0921 15:27:29.322383   10408 system_pods.go:89] "kube-apiserver-pause-20220921152522-3535" [0e89e308-e699-430a-9feb-d0b972291f03] Running
	I0921 15:27:29.322388   10408 system_pods.go:89] "kube-controller-manager-pause-20220921152522-3535" [1e9f7576-ef69-4d06-b19d-0cf5fb9d0471] Running
	I0921 15:27:29.322391   10408 system_pods.go:89] "kube-proxy-5c7jc" [1c5b06ea-f4c2-45b9-a80e-d85983bb3282] Running
	I0921 15:27:29.322395   10408 system_pods.go:89] "kube-scheduler-pause-20220921152522-3535" [cb32a64b-32f0-46e6-8f1c-f2a3460c5fbb] Running
	I0921 15:27:29.322398   10408 system_pods.go:89] "storage-provisioner" [f71f00f0-f421-45c2-bfe4-c1e99f11b8e5] Running
	I0921 15:27:29.322402   10408 system_pods.go:126] duration metric: took 202.671392ms to wait for k8s-apps to be running ...
	I0921 15:27:29.322407   10408 system_svc.go:44] waiting for kubelet service to be running ....
	I0921 15:27:29.322452   10408 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 15:27:29.331792   10408 system_svc.go:56] duration metric: took 9.381149ms WaitForService to wait for kubelet.
	I0921 15:27:29.331804   10408 kubeadm.go:573] duration metric: took 3.387565971s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0921 15:27:29.331823   10408 node_conditions.go:102] verifying NodePressure condition ...
	I0921 15:27:29.518084   10408 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0921 15:27:29.518100   10408 node_conditions.go:123] node cpu capacity is 2
	I0921 15:27:29.518105   10408 node_conditions.go:105] duration metric: took 186.278888ms to run NodePressure ...
	I0921 15:27:29.518113   10408 start.go:216] waiting for startup goroutines ...
	I0921 15:27:29.551427   10408 start.go:506] kubectl: 1.25.0, cluster: 1.25.2 (minor skew: 0)
	I0921 15:27:29.611327   10408 out.go:177] * Done! kubectl is now configured to use "pause-20220921152522-3535" cluster and "default" namespace by default
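The readiness sequence logged above (pgrep for the apiserver process, then polling the `/healthz` endpoint) can be reproduced by hand against this cluster; a minimal sketch in shell, assuming the profile from this report still exists and the VM IP `192.168.64.28` shown in the log:

```shell
# Hypothetical manual re-run of minikube's apiserver readiness probes.
# Profile name and endpoint are taken from the log above; both checks
# require the pause-20220921152522-3535 VM to still be running.
minikube ssh -p pause-20220921152522-3535 -- \
  'sudo pgrep -xnf "kube-apiserver.*minikube.*"'      # apiserver process is up
minikube ssh -p pause-20220921152522-3535 -- \
  'curl -sk https://192.168.64.28:8443/healthz'       # healthz should answer "ok"
```

The `-k` flag skips TLS verification, matching the unauthenticated healthz probe the test harness performs.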
	
	* 
	* ==> Docker <==
	* -- Journal begins at Wed 2022-09-21 22:25:29 UTC, ends at Wed 2022-09-21 22:27:33 UTC. --
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.405457988Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/64651e97bf148aa1e9fbcad6bfbec4d1e8535ad920f0d5c47cd57190f6804445 pid=5990 runtime=io.containerd.runc.v2
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406210133Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406245445Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406253448Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.406435610Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/207eee071672f5cc181475db6e621afacd6722bc026b03a3b344ad50e1cefc78 pid=5992 runtime=io.containerd.runc.v2
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422862395Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422958571Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.422967730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:07 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:07.423253250Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/534b0d7cd88d7c2d979cc7e5c6eb29977494de71ff82fec3d02420ecb80a30b9 pid=6024 runtime=io.containerd.runc.v2
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785293775Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785363542Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785372748Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:15 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:15.785536470Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1650473a18ef5642e63da9873326d2ed8d331ce75d182aaf5834afe35d8f1c48 pid=6217 runtime=io.containerd.runc.v2
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098886881Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098975354Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.098986289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:16 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:16.099142849Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/152338a53f1e4e1033c391833e8d6cba34a8c41caa549b9524e155354c7edd68 pid=6265 runtime=io.containerd.runc.v2
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192601808Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192670528Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192679056Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.192948353Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/c41fc7d463dbce833eb22fe2cbe7272c863767af9f5ce4eb37b36c8efa33b012 pid=6532 runtime=io.containerd.runc.v2
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493268572Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493331709Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493341289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 21 22:27:27 pause-20220921152522-3535 dockerd[3700]: time="2022-09-21T22:27:27.493781950Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/e6a3aeef0ff7cec28ea93bae81a53252f4adbfe81f9da2e64add46df53fa77f2 pid=6573 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID
	e6a3aeef0ff7c       6e38f40d628db       6 seconds ago        Running             storage-provisioner       0                   c41fc7d463dbc
	152338a53f1e4       1c7d8c51823b5       17 seconds ago       Running             kube-proxy                3                   f67bd5c5d43e1
	1650473a18ef5       5185b96f0becf       18 seconds ago       Running             coredns                   2                   92cc25df1c118
	64651e97bf148       a8a176a5d5d69       26 seconds ago       Running             etcd                      3                   0249ca0da9611
	207eee071672f       ca0ea1ee3cfd3       26 seconds ago       Running             kube-scheduler            3                   522a493620409
	534b0d7cd88d7       dbfceb93c69b6       26 seconds ago       Running             kube-controller-manager   3                   f60c5ce6318fc
	b6d4531497f33       97801f8394908       31 seconds ago       Running             kube-apiserver            3                   0ca250926532e
	d7cbc4c453b05       ca0ea1ee3cfd3       42 seconds ago       Exited              kube-scheduler            2                   1a3e01fca5715
	823942ffecb6f       dbfceb93c69b6       45 seconds ago       Exited              kube-controller-manager   2                   e1129956136e0
	283fac289f860       a8a176a5d5d69       46 seconds ago       Exited              etcd                      2                   eb1318ed7bcc9
	c2e8fe8419a96       1c7d8c51823b5       47 seconds ago       Exited              kube-proxy                2                   994dd806c8bfd
	4934b6e15931f       5185b96f0becf       About a minute ago   Exited              coredns                   1                   163c82f50ebf1
	3a4741e1fe3c0       97801f8394908       About a minute ago   Exited              kube-apiserver            2                   3d0143698c2dc
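A table like the container-status listing above can be regenerated inside the VM with crictl; a sketch, assuming the same profile and that the VM is still up:

```shell
# Hypothetical regeneration of the container status table above.
# crictl talks to the cri-dockerd socket noted in the node annotations.
minikube ssh -p pause-20220921152522-3535 -- 'sudo crictl ps -a'
```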
	
	* 
	* ==> coredns [1650473a18ef] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> coredns [4934b6e15931] <==
	* [INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": net/http: TLS handshake timeout
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: network is unreachable
	
	* 
	* ==> describe nodes <==
	* Name:               pause-20220921152522-3535
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-20220921152522-3535
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=937c68716dfaac5b5ffa3b6655158d5d3472b8c4
	                    minikube.k8s.io/name=pause-20220921152522-3535
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_09_21T15_25_59_0700
	                    minikube.k8s.io/version=v1.27.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 21 Sep 2022 22:25:58 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-20220921152522-3535
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 21 Sep 2022 22:27:24 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:25:58 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 21 Sep 2022 22:27:14 +0000   Wed, 21 Sep 2022 22:26:09 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.28
	  Hostname:    pause-20220921152522-3535
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 0962272db386446fb19d5815e48c70e2
	  System UUID:                485511ed-0000-0000-82c9-149d997fca88
	  Boot ID:                    e52786ed-2040-47a8-9190-c9c808b4a98b
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.18
	  Kubelet Version:            v1.25.2
	  Kube-Proxy Version:         v1.25.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                 ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-9wtnp                             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     83s
	  kube-system                 etcd-pause-20220921152522-3535                       100m (5%)     0 (0%)      100Mi (5%)       0 (0%)         95s
	  kube-system                 kube-apiserver-pause-20220921152522-3535             250m (12%)    0 (0%)      0 (0%)           0 (0%)         95s
	  kube-system                 kube-controller-manager-pause-20220921152522-3535    200m (10%)    0 (0%)      0 (0%)           0 (0%)         95s
	  kube-system                 kube-proxy-5c7jc                                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         83s
	  kube-system                 kube-scheduler-pause-20220921152522-3535             100m (5%)     0 (0%)      0 (0%)           0 (0%)         95s
	  kube-system                 storage-provisioner                                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         8s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 81s                  kube-proxy       
	  Normal  Starting                 17s                  kube-proxy       
	  Normal  Starting                 67s                  kube-proxy       
	  Normal  NodeHasSufficientPID     109s (x5 over 109s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  NodeHasNoDiskPressure    109s (x6 over 109s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  109s (x6 over 109s)  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  Starting                 95s                  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  95s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  95s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    95s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     95s                  kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  NodeReady                85s                  kubelet          Node pause-20220921152522-3535 status is now: NodeReady
	  Normal  RegisteredNode           83s                  node-controller  Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller
	  Normal  Starting                 28s                  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  28s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  27s (x8 over 28s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27s (x8 over 28s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     27s (x7 over 28s)    kubelet          Node pause-20220921152522-3535 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           7s                   node-controller  Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller
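The node description above (labels, conditions, capacity, events) is the output format of `kubectl describe node`; a sketch of the equivalent invocation, assuming the kubeconfig context this test run configured:

```shell
# Hypothetical re-query of the node state shown above; requires the
# kubeconfig context created by this test run to still be valid.
kubectl --context pause-20220921152522-3535 \
  describe node pause-20220921152522-3535
```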
	
	* 
	* ==> dmesg <==
	* [  +0.000001] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.836758] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000004] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000000] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.731337] systemd-fstab-generator[530]: Ignoring "noauto" for root device
	[  +0.090984] systemd-fstab-generator[541]: Ignoring "noauto" for root device
	[  +5.027202] systemd-fstab-generator[762]: Ignoring "noauto" for root device
	[  +1.197234] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.214769] systemd-fstab-generator[921]: Ignoring "noauto" for root device
	[  +0.091300] systemd-fstab-generator[932]: Ignoring "noauto" for root device
	[  +0.097321] systemd-fstab-generator[943]: Ignoring "noauto" for root device
	[  +1.296604] systemd-fstab-generator[1093]: Ignoring "noauto" for root device
	[  +0.087737] systemd-fstab-generator[1104]: Ignoring "noauto" for root device
	[  +3.910315] systemd-fstab-generator[1322]: Ignoring "noauto" for root device
	[  +0.546371] kauditd_printk_skb: 68 callbacks suppressed
	[ +13.692006] systemd-fstab-generator[1995]: Ignoring "noauto" for root device
	[Sep21 22:26] kauditd_printk_skb: 8 callbacks suppressed
	[  +8.344097] systemd-fstab-generator[2768]: Ignoring "noauto" for root device
	[  +0.136976] systemd-fstab-generator[2779]: Ignoring "noauto" for root device
	[  +0.134278] systemd-fstab-generator[2790]: Ignoring "noauto" for root device
	[  +0.497533] kauditd_printk_skb: 17 callbacks suppressed
	[  +7.690771] systemd-fstab-generator[4167]: Ignoring "noauto" for root device
	[  +0.127432] systemd-fstab-generator[4182]: Ignoring "noauto" for root device
	[ +31.144308] kauditd_printk_skb: 34 callbacks suppressed
	[Sep21 22:27] systemd-fstab-generator[5830]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [283fac289f86] <==
	* {"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:26:47.976Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:26:49.366Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 is starting a new election at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became pre-candidate at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgPreVoteResp from d3378a43e4252963 at term 3"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became candidate at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgVoteResp from d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became leader at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: d3378a43e4252963 elected leader d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"d3378a43e4252963","local-member-attributes":"{Name:pause-20220921152522-3535 ClientURLs:[https://192.168.64.28:2379]}","request-path":"/0/members/d3378a43e4252963/attributes","cluster-id":"e703c3abd1a7846","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:26:49.367Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:26:49.368Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-21T22:26:49.370Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.28:2379"}
	{"level":"info","ts":"2022-09-21T22:26:49.375Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-21T22:26:49.376Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-09-21T22:27:00.388Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-09-21T22:27:00.388Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-20220921152522-3535","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"]}
	WARNING: 2022/09/21 22:27:00 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/09/21 22:27:00 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.28:2379 192.168.64.28:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.28:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-09-21T22:27:00.391Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"d3378a43e4252963","current-leader-member-id":"d3378a43e4252963"}
	{"level":"info","ts":"2022-09-21T22:27:00.392Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:00.394Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:00.394Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-20220921152522-3535","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"]}
	
	* 
	* ==> etcd [64651e97bf14] <==
	* {"level":"info","ts":"2022-09-21T22:27:08.280Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"d3378a43e4252963","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-09-21T22:27:08.282Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"d3378a43e4252963","initial-advertise-peer-urls":["https://192.168.64.28:2380"],"listen-peer-urls":["https://192.168.64.28:2380"],"advertise-client-urls":["https://192.168.64.28:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.28:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-21T22:27:08.282Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 switched to configuration voters=(15219785489916963171)"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"e703c3abd1a7846","local-member-id":"d3378a43e4252963","added-peer-id":"d3378a43e4252963","added-peer-peer-urls":["https://192.168.64.28:2380"]}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"e703c3abd1a7846","local-member-id":"d3378a43e4252963","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:08.285Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.28:2380"}
	{"level":"info","ts":"2022-09-21T22:27:08.283Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 is starting a new election at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became pre-candidate at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgPreVoteResp from d3378a43e4252963 at term 4"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became candidate at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 received MsgVoteResp from d3378a43e4252963 at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"d3378a43e4252963 became leader at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: d3378a43e4252963 elected leader d3378a43e4252963 at term 5"}
	{"level":"info","ts":"2022-09-21T22:27:09.547Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"d3378a43e4252963","local-member-attributes":"{Name:pause-20220921152522-3535 ClientURLs:[https://192.168.64.28:2379]}","request-path":"/0/members/d3378a43e4252963/attributes","cluster-id":"e703c3abd1a7846","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-21T22:27:09.548Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:27:09.548Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.28:2379"}
	{"level":"info","ts":"2022-09-21T22:27:09.549Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-21T22:27:09.549Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-21T22:27:09.550Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-21T22:27:09.550Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> kernel <==
	*  22:27:34 up 2 min,  0 users,  load average: 0.36, 0.20, 0.08
	Linux pause-20220921152522-3535 5.10.57 #1 SMP Sat Sep 10 02:24:46 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [3a4741e1fe3c] <==
	* W0921 22:26:42.249889       1 logging.go:59] [core] [Channel #3 SubChannel #5] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0921 22:26:42.252491       1 logging.go:59] [core] [Channel #4 SubChannel #6] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0921 22:26:47.844900       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	E0921 22:26:51.410448       1 run.go:74] "command failed" err="context deadline exceeded"
	
	* 
	* ==> kube-apiserver [b6d4531497f3] <==
	* I0921 22:27:14.062878       1 controller.go:85] Starting OpenAPI controller
	I0921 22:27:14.063014       1 controller.go:85] Starting OpenAPI V3 controller
	I0921 22:27:14.063120       1 naming_controller.go:291] Starting NamingConditionController
	I0921 22:27:14.063157       1 establishing_controller.go:76] Starting EstablishingController
	I0921 22:27:14.063169       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I0921 22:27:14.063271       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0921 22:27:14.063303       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0921 22:27:14.071305       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0921 22:27:14.072396       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0921 22:27:14.156918       1 cache.go:39] Caches are synced for autoregister controller
	I0921 22:27:14.157381       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0921 22:27:14.159134       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0921 22:27:14.160295       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I0921 22:27:14.162748       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I0921 22:27:14.164291       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I0921 22:27:14.214291       1 shared_informer.go:262] Caches are synced for node_authorizer
	I0921 22:27:14.252859       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I0921 22:27:14.849364       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I0921 22:27:15.061773       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0921 22:27:15.487959       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I0921 22:27:15.496083       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I0921 22:27:15.512729       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I0921 22:27:15.525104       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0921 22:27:15.528873       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0921 22:27:26.810346       1 controller.go:616] quota admission added evaluator for: endpoints
	
	* 
	* ==> kube-controller-manager [534b0d7cd88d] <==
	* I0921 22:27:27.091965       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	W0921 22:27:27.092105       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-20220921152522-3535. Assuming now as a timestamp.
	I0921 22:27:27.092144       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I0921 22:27:27.092272       1 event.go:294] "Event occurred" object="pause-20220921152522-3535" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-20220921152522-3535 event: Registered Node pause-20220921152522-3535 in Controller"
	I0921 22:27:27.110604       1 shared_informer.go:262] Caches are synced for TTL
	I0921 22:27:27.111981       1 shared_informer.go:262] Caches are synced for ReplicaSet
	I0921 22:27:27.112202       1 shared_informer.go:262] Caches are synced for HPA
	I0921 22:27:27.112592       1 shared_informer.go:262] Caches are synced for TTL after finished
	I0921 22:27:27.115223       1 shared_informer.go:262] Caches are synced for namespace
	I0921 22:27:27.118788       1 shared_informer.go:262] Caches are synced for job
	I0921 22:27:27.122949       1 shared_informer.go:262] Caches are synced for cronjob
	I0921 22:27:27.126944       1 shared_informer.go:262] Caches are synced for endpoint
	I0921 22:27:27.160485       1 shared_informer.go:262] Caches are synced for expand
	I0921 22:27:27.173668       1 shared_informer.go:262] Caches are synced for persistent volume
	I0921 22:27:27.175944       1 shared_informer.go:262] Caches are synced for endpoint_slice_mirroring
	I0921 22:27:27.203878       1 shared_informer.go:262] Caches are synced for attach detach
	I0921 22:27:27.211345       1 shared_informer.go:262] Caches are synced for PV protection
	I0921 22:27:27.216091       1 shared_informer.go:262] Caches are synced for resource quota
	I0921 22:27:27.220621       1 shared_informer.go:262] Caches are synced for stateful set
	I0921 22:27:27.261055       1 shared_informer.go:262] Caches are synced for endpoint_slice
	I0921 22:27:27.269364       1 shared_informer.go:262] Caches are synced for resource quota
	I0921 22:27:27.311010       1 shared_informer.go:262] Caches are synced for daemon sets
	I0921 22:27:27.654916       1 shared_informer.go:262] Caches are synced for garbage collector
	I0921 22:27:27.686746       1 shared_informer.go:262] Caches are synced for garbage collector
	I0921 22:27:27.686841       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-controller-manager [823942ffecb6] <==
	* I0921 22:26:49.430074       1 serving.go:348] Generated self-signed cert in-memory
	I0921 22:26:50.068771       1 controllermanager.go:178] Version: v1.25.2
	I0921 22:26:50.068811       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:26:50.069610       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I0921 22:26:50.069706       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0921 22:26:50.069775       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0921 22:26:50.070146       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	
	* 
	* ==> kube-proxy [152338a53f1e] <==
	* I0921 22:27:16.200105       1 node.go:163] Successfully retrieved node IP: 192.168.64.28
	I0921 22:27:16.200255       1 server_others.go:138] "Detected node IP" address="192.168.64.28"
	I0921 22:27:16.200284       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0921 22:27:16.220796       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0921 22:27:16.220810       1 server_others.go:206] "Using iptables Proxier"
	I0921 22:27:16.220829       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I0921 22:27:16.221038       1 server.go:661] "Version info" version="v1.25.2"
	I0921 22:27:16.221047       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:27:16.221421       1 config.go:317] "Starting service config controller"
	I0921 22:27:16.221427       1 shared_informer.go:255] Waiting for caches to sync for service config
	I0921 22:27:16.221438       1 config.go:226] "Starting endpoint slice config controller"
	I0921 22:27:16.221440       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I0921 22:27:16.221790       1 config.go:444] "Starting node config controller"
	I0921 22:27:16.221831       1 shared_informer.go:255] Waiting for caches to sync for node config
	I0921 22:27:16.321553       1 shared_informer.go:262] Caches are synced for endpoint slice config
	I0921 22:27:16.321868       1 shared_informer.go:262] Caches are synced for service config
	I0921 22:27:16.322427       1 shared_informer.go:262] Caches are synced for node config
	
	* 
	* ==> kube-proxy [c2e8fe8419a9] <==
	* E0921 22:26:52.417919       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.64.28:45762->192.168.64.28:8443: read: connection reset by peer
	E0921 22:26:53.525473       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:55.541635       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.072196       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220921152522-3535": dial tcp 192.168.64.28:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [207eee071672] <==
	* I0921 22:27:07.942128       1 serving.go:348] Generated self-signed cert in-memory
	W0921 22:27:14.136528       1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W0921 22:27:14.136587       1 authentication.go:346] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W0921 22:27:14.136596       1 authentication.go:347] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0921 22:27:14.136622       1 authentication.go:348] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0921 22:27:14.160522       1 server.go:148] "Starting Kubernetes Scheduler" version="v1.25.2"
	I0921 22:27:14.160612       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0921 22:27:14.161435       1 secure_serving.go:210] Serving securely on 127.0.0.1:10259
	I0921 22:27:14.161580       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0921 22:27:14.163051       1 shared_informer.go:255] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0921 22:27:14.161599       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0921 22:27:14.263724       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kube-scheduler [d7cbc4c453b0] <==
	* W0921 22:26:56.662066       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: Get "https://192.168.64.28:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.662326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.168.64.28:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.676873       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.677417       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.727262       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.28:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.727389       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.28:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:56.792874       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.28:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:56.792933       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.64.28:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:57.019135       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:57.019287       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:57.111170       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.28:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:57.111256       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.28:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:59.563534       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.28:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:59.563559       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.28:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:26:59.965353       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:26:59.965379       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.64.28:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:27:00.044825       1 reflector.go:424] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: Get "https://192.168.64.28:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.044871       1 reflector.go:140] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.168.64.28:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	W0921 22:27:00.384285       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.384326       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.28:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.28:8443: connect: connection refused
	E0921 22:27:00.398528       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0921 22:27:00.398546       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0921 22:27:00.398572       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I0921 22:27:00.398622       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E0921 22:27:00.398861       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Wed 2022-09-21 22:25:29 UTC, ends at Wed 2022-09-21 22:27:35 UTC. --
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.739144    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.839713    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:13 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:13.940319    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: E0921 22:27:14.040786    5836 kubelet.go:2448] "Error getting node" err="node \"pause-20220921152522-3535\" not found"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.141509    5836 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.142001    5836 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.235105    5836 kubelet_node_status.go:108] "Node was previously registered" node="pause-20220921152522-3535"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.235257    5836 kubelet_node_status.go:73] "Successfully registered node" node="pause-20220921152522-3535"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.845723    5836 apiserver.go:52] "Watching apiserver"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.847588    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.847682    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951602    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-kube-proxy\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951731    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-lib-modules\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951776    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-xtables-lock\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951850    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb8f3bae-6107-4a2b-ba32-d79405830bf0-config-volume\") pod \"coredns-565d847f94-9wtnp\" (UID: \"eb8f3bae-6107-4a2b-ba32-d79405830bf0\") " pod="kube-system/coredns-565d847f94-9wtnp"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951882    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2kwd\" (UniqueName: \"kubernetes.io/projected/eb8f3bae-6107-4a2b-ba32-d79405830bf0-kube-api-access-p2kwd\") pod \"coredns-565d847f94-9wtnp\" (UID: \"eb8f3bae-6107-4a2b-ba32-d79405830bf0\") " pod="kube-system/coredns-565d847f94-9wtnp"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951915    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2rf\" (UniqueName: \"kubernetes.io/projected/1c5b06ea-f4c2-45b9-a80e-d85983bb3282-kube-api-access-zh2rf\") pod \"kube-proxy-5c7jc\" (UID: \"1c5b06ea-f4c2-45b9-a80e-d85983bb3282\") " pod="kube-system/kube-proxy-5c7jc"
	Sep 21 22:27:14 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:14.951971    5836 reconciler.go:169] "Reconciler: start to sync state"
	Sep 21 22:27:15 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:15.748097    5836 scope.go:115] "RemoveContainer" containerID="4934b6e15931f96c8cd7409c9d9d107463001d3dbbe402bc7ecacd045cfdf26e"
	Sep 21 22:27:16 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:16.049291    5836 scope.go:115] "RemoveContainer" containerID="c2e8fe8419a96380dd14dec68931ed3399dbf26a6ff33aace75ae52a339d8568"
	Sep 21 22:27:23 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:23.685529    5836 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.821517    5836 topology_manager.go:205] "Topology Admit Handler"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.979546    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/f71f00f0-f421-45c2-bfe4-c1e99f11b8e5-tmp\") pod \"storage-provisioner\" (UID: \"f71f00f0-f421-45c2-bfe4-c1e99f11b8e5\") " pod="kube-system/storage-provisioner"
	Sep 21 22:27:26 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:26.979717    5836 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2k8\" (UniqueName: \"kubernetes.io/projected/f71f00f0-f421-45c2-bfe4-c1e99f11b8e5-kube-api-access-tv2k8\") pod \"storage-provisioner\" (UID: \"f71f00f0-f421-45c2-bfe4-c1e99f11b8e5\") " pod="kube-system/storage-provisioner"
	Sep 21 22:27:27 pause-20220921152522-3535 kubelet[5836]: I0921 22:27:27.456744    5836 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="c41fc7d463dbce833eb22fe2cbe7272c863767af9f5ce4eb37b36c8efa33b012"
	
	* 
	* ==> storage-provisioner [e6a3aeef0ff7] <==
	* I0921 22:27:27.575776       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0921 22:27:27.585007       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0921 22:27:27.585247       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0921 22:27:27.589937       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0921 22:27:27.590215       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1!
	I0921 22:27:27.591354       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"cea77369-71af-4aec-8a4d-59cc48396b09", APIVersion:"v1", ResourceVersion:"467", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1 became leader
	I0921 22:27:27.690985       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-20220921152522-3535_c99c674d-e74f-4876-b9bc-cca2318207c1!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-20220921152522-3535 -n pause-20220921152522-3535
helpers_test.go:261: (dbg) Run:  kubectl --context pause-20220921152522-3535 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-20220921152522-3535 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-20220921152522-3535 describe pod : exit status 1 (37.073474ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context pause-20220921152522-3535 describe pod : exit status 1
--- FAIL: TestPause/serial/SecondStartNoReconfiguration (79.73s)

TestNetworkPlugins/group/kubenet/HairPin (57.31s)

=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.096216155s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.114269782s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.102621305s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0921 15:32:19.703718    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.097750452s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.098611328s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0921 15:32:36.654898    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:32:40.203305    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.208452    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.218900    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.239143    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.279736    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.360089    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.522293    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:40.843107    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0921 15:32:41.485025    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:42.765328    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:32:45.326126    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.104980509s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0921 15:32:50.447803    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.10382812s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:243: failed to connect via pod host: exit status 1
--- FAIL: TestNetworkPlugins/group/kubenet/HairPin (57.31s)


Test pass (281/299)

Order passed test Duration
3 TestDownloadOnly/v1.16.0/json-events 9.66
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.29
10 TestDownloadOnly/v1.25.2/json-events 6.61
11 TestDownloadOnly/v1.25.2/preload-exists 0
14 TestDownloadOnly/v1.25.2/kubectl 0
15 TestDownloadOnly/v1.25.2/LogsDuration 0.33
16 TestDownloadOnly/DeleteAll 0.44
17 TestDownloadOnly/DeleteAlwaysSucceeds 0.42
19 TestBinaryMirror 1.05
20 TestOffline 61.43
22 TestAddons/Setup 144.85
24 TestAddons/parallel/Registry 18.52
25 TestAddons/parallel/Ingress 23.71
26 TestAddons/parallel/MetricsServer 5.43
27 TestAddons/parallel/HelmTiller 11.36
29 TestAddons/parallel/CSI 47.21
30 TestAddons/parallel/Headlamp 11.14
32 TestAddons/serial/GCPAuth 19.37
33 TestAddons/StoppedEnableDisable 3.56
34 TestCertOptions 40.76
35 TestCertExpiration 251.37
36 TestDockerFlags 58.39
37 TestForceSystemdFlag 44.69
38 TestForceSystemdEnv 43.01
40 TestHyperKitDriverInstallOrUpdate 6.71
43 TestErrorSpam/setup 52.58
44 TestErrorSpam/start 1.2
45 TestErrorSpam/status 0.45
46 TestErrorSpam/pause 1.26
47 TestErrorSpam/unpause 1.31
48 TestErrorSpam/stop 3.64
51 TestFunctional/serial/CopySyncFile 0
52 TestFunctional/serial/StartWithProxy 64.5
53 TestFunctional/serial/AuditLog 0
54 TestFunctional/serial/SoftStart 52.42
55 TestFunctional/serial/KubeContext 0.03
56 TestFunctional/serial/KubectlGetPods 0.05
59 TestFunctional/serial/CacheCmd/cache/add_remote 10.39
60 TestFunctional/serial/CacheCmd/cache/add_local 1.59
61 TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 0.07
62 TestFunctional/serial/CacheCmd/cache/list 0.07
63 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.16
64 TestFunctional/serial/CacheCmd/cache/cache_reload 2.3
65 TestFunctional/serial/CacheCmd/cache/delete 0.15
66 TestFunctional/serial/MinikubeKubectlCmd 0.49
67 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.64
68 TestFunctional/serial/ExtraConfig 43.85
69 TestFunctional/serial/ComponentHealth 0.05
70 TestFunctional/serial/LogsCmd 2.59
71 TestFunctional/serial/LogsFileCmd 2.69
73 TestFunctional/parallel/ConfigCmd 0.5
74 TestFunctional/parallel/DashboardCmd 13.06
75 TestFunctional/parallel/DryRun 1.14
76 TestFunctional/parallel/InternationalLanguage 0.48
77 TestFunctional/parallel/StatusCmd 0.44
80 TestFunctional/parallel/ServiceCmd 10.18
81 TestFunctional/parallel/ServiceCmdConnect 13.34
82 TestFunctional/parallel/AddonsCmd 0.26
83 TestFunctional/parallel/PersistentVolumeClaim 27.03
85 TestFunctional/parallel/SSHCmd 0.27
86 TestFunctional/parallel/CpCmd 0.58
87 TestFunctional/parallel/MySQL 19.95
88 TestFunctional/parallel/FileSync 0.14
89 TestFunctional/parallel/CertSync 0.83
93 TestFunctional/parallel/NodeLabels 0.05
95 TestFunctional/parallel/NonActiveRuntimeDisabled 0.17
97 TestFunctional/parallel/Version/short 0.09
98 TestFunctional/parallel/Version/components 0.35
99 TestFunctional/parallel/ImageCommands/ImageListShort 0.16
100 TestFunctional/parallel/ImageCommands/ImageListTable 0.15
101 TestFunctional/parallel/ImageCommands/ImageListJson 0.19
102 TestFunctional/parallel/ImageCommands/ImageListYaml 0.16
103 TestFunctional/parallel/ImageCommands/ImageBuild 5.71
104 TestFunctional/parallel/ImageCommands/Setup 4.02
105 TestFunctional/parallel/DockerEnv/bash 0.63
106 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
107 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.18
108 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.18
109 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.7
110 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.04
111 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 6.65
112 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.83
113 TestFunctional/parallel/ImageCommands/ImageRemove 0.32
114 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.44
115 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 2.26
116 TestFunctional/parallel/ProfileCmd/profile_not_create 0.41
118 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
120 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 11.12
121 TestFunctional/parallel/ProfileCmd/profile_list 0.29
122 TestFunctional/parallel/ProfileCmd/profile_json_output 0.37
123 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.04
124 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.01
125 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.03
126 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.02
127 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.05
128 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
129 TestFunctional/parallel/MountCmd/any-port 9.85
130 TestFunctional/parallel/MountCmd/specific-port 1.44
131 TestFunctional/delete_addon-resizer_images 0.16
132 TestFunctional/delete_my-image_image 0.08
133 TestFunctional/delete_minikube_cached_images 0.06
136 TestIngressAddonLegacy/StartLegacyK8sCluster 75.63
138 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 16.22
139 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.54
140 TestIngressAddonLegacy/serial/ValidateIngressAddons 29.56
143 TestJSONOutput/start/Command 52.98
144 TestJSONOutput/start/Audit 0
146 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
147 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
149 TestJSONOutput/pause/Command 0.5
150 TestJSONOutput/pause/Audit 0
152 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
153 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
155 TestJSONOutput/unpause/Command 0.43
156 TestJSONOutput/unpause/Audit 0
158 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
159 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
161 TestJSONOutput/stop/Command 8.15
162 TestJSONOutput/stop/Audit 0
164 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
165 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
166 TestErrorJSONOutput 0.76
170 TestMainNoArgs 0.07
171 TestMinikubeProfile 90.65
174 TestMountStart/serial/StartWithMountFirst 15.56
175 TestMountStart/serial/VerifyMountFirst 0.27
176 TestMountStart/serial/StartWithMountSecond 16.84
177 TestMountStart/serial/VerifyMountSecond 0.27
178 TestMountStart/serial/DeleteFirst 2.36
179 TestMountStart/serial/VerifyMountPostDelete 0.27
180 TestMountStart/serial/Stop 2.23
181 TestMountStart/serial/RestartStopped 15.98
182 TestMountStart/serial/VerifyMountPostStop 0.28
185 TestMultiNode/serial/FreshStart2Nodes 120.8
186 TestMultiNode/serial/DeployApp2Nodes 7.71
187 TestMultiNode/serial/PingHostFrom2Pods 0.82
188 TestMultiNode/serial/AddNode 43.31
189 TestMultiNode/serial/ProfileList 0.26
190 TestMultiNode/serial/CopyFile 5.08
191 TestMultiNode/serial/StopNode 2.66
192 TestMultiNode/serial/StartAfterStop 33.46
193 TestMultiNode/serial/RestartKeepsNodes 910.6
194 TestMultiNode/serial/DeleteNode 4.95
195 TestMultiNode/serial/StopMultiNode 4.48
196 TestMultiNode/serial/RestartMultiNode 553.86
197 TestMultiNode/serial/ValidateNameConflict 47.36
201 TestPreload 172.24
203 TestScheduledStopUnix 112.59
204 TestSkaffold 73.14
207 TestRunningBinaryUpgrade 168.71
209 TestKubernetesUpgrade 139.64
222 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 4.07
223 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.04
231 TestStoppedBinaryUpgrade/Setup 0.65
232 TestStoppedBinaryUpgrade/Upgrade 172.45
233 TestStoppedBinaryUpgrade/MinikubeLogs 2.23
235 TestNoKubernetes/serial/StartNoK8sWithVersion 0.39
236 TestNoKubernetes/serial/StartWithK8s 41.86
237 TestNoKubernetes/serial/StartWithStopK8s 7.9
239 TestPause/serial/Start 54.31
240 TestNoKubernetes/serial/Start 21.9
241 TestNoKubernetes/serial/VerifyK8sNotRunning 0.12
242 TestNoKubernetes/serial/ProfileList 0.53
243 TestNoKubernetes/serial/Stop 2.21
244 TestNoKubernetes/serial/StartNoArgs 15.5
245 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.11
246 TestNetworkPlugins/group/false/Start 91.45
248 TestNetworkPlugins/group/false/KubeletFlags 0.15
249 TestNetworkPlugins/group/false/NetCatPod 13.3
250 TestNetworkPlugins/group/auto/Start 54.48
251 TestNetworkPlugins/group/false/DNS 0.13
252 TestNetworkPlugins/group/false/Localhost 0.11
253 TestNetworkPlugins/group/false/HairPin 5.12
254 TestNetworkPlugins/group/kindnet/Start 62.38
255 TestNetworkPlugins/group/auto/KubeletFlags 0.15
256 TestNetworkPlugins/group/auto/NetCatPod 13.19
257 TestNetworkPlugins/group/auto/DNS 0.12
258 TestNetworkPlugins/group/auto/Localhost 0.09
259 TestNetworkPlugins/group/auto/HairPin 5.1
260 TestNetworkPlugins/group/flannel/Start 93.01
261 TestNetworkPlugins/group/kindnet/ControllerPod 5.01
262 TestNetworkPlugins/group/kindnet/KubeletFlags 0.15
263 TestNetworkPlugins/group/kindnet/NetCatPod 14.19
264 TestNetworkPlugins/group/kindnet/DNS 0.11
265 TestNetworkPlugins/group/kindnet/Localhost 0.1
266 TestNetworkPlugins/group/kindnet/HairPin 0.1
267 TestNetworkPlugins/group/enable-default-cni/Start 56.65
268 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.21
269 TestNetworkPlugins/group/enable-default-cni/NetCatPod 13.19
270 TestNetworkPlugins/group/flannel/ControllerPod 5.01
271 TestNetworkPlugins/group/flannel/KubeletFlags 0.14
272 TestNetworkPlugins/group/flannel/NetCatPod 13.19
273 TestNetworkPlugins/group/enable-default-cni/DNS 0.11
274 TestNetworkPlugins/group/enable-default-cni/Localhost 0.1
275 TestNetworkPlugins/group/enable-default-cni/HairPin 0.11
276 TestNetworkPlugins/group/bridge/Start 53.56
277 TestNetworkPlugins/group/flannel/DNS 0.11
278 TestNetworkPlugins/group/flannel/Localhost 0.1
279 TestNetworkPlugins/group/flannel/HairPin 0.11
280 TestNetworkPlugins/group/kubenet/Start 51.46
281 TestNetworkPlugins/group/bridge/KubeletFlags 0.15
282 TestNetworkPlugins/group/bridge/NetCatPod 13.2
283 TestNetworkPlugins/group/kubenet/KubeletFlags 0.15
284 TestNetworkPlugins/group/kubenet/NetCatPod 14.17
285 TestNetworkPlugins/group/bridge/DNS 0.11
286 TestNetworkPlugins/group/bridge/Localhost 0.1
287 TestNetworkPlugins/group/bridge/HairPin 0.1
288 TestNetworkPlugins/group/calico/Start 309.28
289 TestNetworkPlugins/group/kubenet/DNS 0.12
290 TestNetworkPlugins/group/kubenet/Localhost 0.11
292 TestNetworkPlugins/group/cilium/Start 87.22
293 TestNetworkPlugins/group/cilium/ControllerPod 5.01
294 TestNetworkPlugins/group/cilium/KubeletFlags 0.15
295 TestNetworkPlugins/group/cilium/NetCatPod 15.64
296 TestNetworkPlugins/group/cilium/DNS 0.12
297 TestNetworkPlugins/group/cilium/Localhost 0.11
298 TestNetworkPlugins/group/cilium/HairPin 0.11
299 TestNetworkPlugins/group/custom-flannel/Start 60.48
300 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.14
301 TestNetworkPlugins/group/custom-flannel/NetCatPod 14.23
302 TestNetworkPlugins/group/custom-flannel/DNS 0.11
303 TestNetworkPlugins/group/custom-flannel/Localhost 0.1
304 TestNetworkPlugins/group/custom-flannel/HairPin 0.11
306 TestStartStop/group/old-k8s-version/serial/FirstStart 148.29
307 TestNetworkPlugins/group/calico/ControllerPod 5.01
308 TestNetworkPlugins/group/calico/KubeletFlags 0.15
309 TestNetworkPlugins/group/calico/NetCatPod 14.28
310 TestNetworkPlugins/group/calico/DNS 0.13
311 TestNetworkPlugins/group/calico/Localhost 0.11
312 TestNetworkPlugins/group/calico/HairPin 0.1
314 TestStartStop/group/no-preload/serial/FirstStart 63.56
315 TestStartStop/group/no-preload/serial/DeployApp 11.25
316 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 0.65
317 TestStartStop/group/no-preload/serial/Stop 3.23
318 TestStartStop/group/old-k8s-version/serial/DeployApp 11.27
319 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.27
320 TestStartStop/group/no-preload/serial/SecondStart 316.06
321 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.57
322 TestStartStop/group/old-k8s-version/serial/Stop 2.22
323 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.26
324 TestStartStop/group/old-k8s-version/serial/SecondStart 456.43
325 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 8.01
326 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.06
327 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.16
328 TestStartStop/group/no-preload/serial/Pause 1.86
330 TestStartStop/group/embed-certs/serial/FirstStart 63.01
331 TestStartStop/group/embed-certs/serial/DeployApp 12.31
332 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.69
333 TestStartStop/group/embed-certs/serial/Stop 3.28
334 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.27
335 TestStartStop/group/embed-certs/serial/SecondStart 313.04
336 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 5.01
337 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
338 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.17
339 TestStartStop/group/old-k8s-version/serial/Pause 1.78
341 TestStartStop/group/default-k8s-different-port/serial/FirstStart 58.2
342 TestStartStop/group/default-k8s-different-port/serial/DeployApp 12.27
343 TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive 0.63
344 TestStartStop/group/default-k8s-different-port/serial/Stop 8.24
345 TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop 0.3
346 TestStartStop/group/default-k8s-different-port/serial/SecondStart 311.48
347 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 12.01
348 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.06
349 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.17
350 TestStartStop/group/embed-certs/serial/Pause 1.84
352 TestStartStop/group/newest-cni/serial/FirstStart 51.29
353 TestStartStop/group/newest-cni/serial/DeployApp 0
354 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 0.77
355 TestStartStop/group/newest-cni/serial/Stop 8.24
356 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.27
357 TestStartStop/group/newest-cni/serial/SecondStart 31.7
358 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
359 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
360 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.2
361 TestStartStop/group/newest-cni/serial/Pause 1.84
362 TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop 7.01
363 TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop 5.06
364 TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages 0.16
365 TestStartStop/group/default-k8s-different-port/serial/Pause 1.81
TestDownloadOnly/v1.16.0/json-events (9.66s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220921142630-3535 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220921142630-3535 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (9.66389861s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (9.66s)

TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220921142630-3535
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220921142630-3535: exit status 85 (285.355189ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	| Command |               Args                |              Profile              |  User   | Version |     Start Time      | End Time |
	|---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only -p        | download-only-20220921142630-3535 | jenkins | v1.27.0 | 21 Sep 22 14:26 PDT |          |
	|         | download-only-20220921142630-3535 |                                   |         |         |                     |          |
	|         | --force --alsologtostderr         |                                   |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0      |                                   |         |         |                     |          |
	|         | --container-runtime=docker        |                                   |         |         |                     |          |
	|         | --driver=hyperkit                 |                                   |         |         |                     |          |
	|---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/21 14:26:30
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.19.1 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0921 14:26:30.422032    3546 out.go:296] Setting OutFile to fd 1 ...
	I0921 14:26:30.422199    3546 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:26:30.422204    3546 out.go:309] Setting ErrFile to fd 2...
	I0921 14:26:30.422207    3546 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:26:30.422322    3546 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	W0921 14:26:30.422429    3546 root.go:310] Error reading config file at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/config/config.json: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/config/config.json: no such file or directory
	I0921 14:26:30.423130    3546 out.go:303] Setting JSON to true
	I0921 14:26:30.438479    3546 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1561,"bootTime":1663794029,"procs":340,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 14:26:30.438580    3546 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 14:26:30.460626    3546 out.go:97] [download-only-20220921142630-3535] minikube v1.27.0 on Darwin 12.6
	W0921 14:26:30.460768    3546 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball: no such file or directory
	I0921 14:26:30.460774    3546 notify.go:214] Checking for updates...
	I0921 14:26:30.482000    3546 out.go:169] MINIKUBE_LOCATION=14995
	I0921 14:26:30.526208    3546 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 14:26:30.568218    3546 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 14:26:30.610032    3546 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 14:26:30.631314    3546 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	W0921 14:26:30.672986    3546 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0921 14:26:30.673260    3546 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 14:26:30.700188    3546 out.go:97] Using the hyperkit driver based on user configuration
	I0921 14:26:30.700211    3546 start.go:284] selected driver: hyperkit
	I0921 14:26:30.700218    3546 start.go:808] validating driver "hyperkit" against <nil>
	I0921 14:26:30.700303    3546 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 14:26:30.700543    3546 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0921 14:26:30.835203    3546 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.27.0
	I0921 14:26:30.838833    3546 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:26:30.838851    3546 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0921 14:26:30.838891    3546 start_flags.go:302] no existing cluster config was found, will generate one from the flags 
	I0921 14:26:30.842849    3546 start_flags.go:383] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0921 14:26:30.842996    3546 start_flags.go:849] Wait components to verify : map[apiserver:true system_pods:true]
	I0921 14:26:30.843019    3546 cni.go:95] Creating CNI manager for ""
	I0921 14:26:30.843027    3546 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 14:26:30.843035    3546 start_flags.go:316] config:
	{Name:download-only-20220921142630-3535 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-20220921142630-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Contain
erRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 14:26:30.843277    3546 iso.go:124] acquiring lock: {Name:mke8c57399926d29e846b47dd4be4625ba5fcaea Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 14:26:30.865068    3546 out.go:97] Downloading VM boot image ...
	I0921 14:26:30.865140    3546 download.go:101] Downloading: https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso?checksum=file:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/iso/amd64/minikube-v1.27.0-amd64.iso
	I0921 14:26:34.305972    3546 out.go:97] Starting control plane node download-only-20220921142630-3535 in cluster download-only-20220921142630-3535
	I0921 14:26:34.305995    3546 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0921 14:26:34.355522    3546 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0921 14:26:34.355544    3546 cache.go:57] Caching tarball of preloaded images
	I0921 14:26:34.355726    3546 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0921 14:26:34.375688    3546 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I0921 14:26:34.375705    3546 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0921 14:26:34.450100    3546 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220921142630-3535"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

TestDownloadOnly/v1.25.2/json-events (6.61s)

=== RUN   TestDownloadOnly/v1.25.2/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220921142630-3535 --force --alsologtostderr --kubernetes-version=v1.25.2 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220921142630-3535 --force --alsologtostderr --kubernetes-version=v1.25.2 --container-runtime=docker --driver=hyperkit : (6.611990903s)
--- PASS: TestDownloadOnly/v1.25.2/json-events (6.61s)

TestDownloadOnly/v1.25.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.25.2/preload-exists
--- PASS: TestDownloadOnly/v1.25.2/preload-exists (0.00s)

TestDownloadOnly/v1.25.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.25.2/kubectl
--- PASS: TestDownloadOnly/v1.25.2/kubectl (0.00s)

TestDownloadOnly/v1.25.2/LogsDuration (0.33s)

=== RUN   TestDownloadOnly/v1.25.2/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220921142630-3535
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220921142630-3535: exit status 85 (327.124548ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	| Command |               Args                |              Profile              |  User   | Version |     Start Time      | End Time |
	|---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only -p        | download-only-20220921142630-3535 | jenkins | v1.27.0 | 21 Sep 22 14:26 PDT |          |
	|         | download-only-20220921142630-3535 |                                   |         |         |                     |          |
	|         | --force --alsologtostderr         |                                   |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0      |                                   |         |         |                     |          |
	|         | --container-runtime=docker        |                                   |         |         |                     |          |
	|         | --driver=hyperkit                 |                                   |         |         |                     |          |
	| start   | -o=json --download-only -p        | download-only-20220921142630-3535 | jenkins | v1.27.0 | 21 Sep 22 14:26 PDT |          |
	|         | download-only-20220921142630-3535 |                                   |         |         |                     |          |
	|         | --force --alsologtostderr         |                                   |         |         |                     |          |
	|         | --kubernetes-version=v1.25.2      |                                   |         |         |                     |          |
	|         | --container-runtime=docker        |                                   |         |         |                     |          |
	|         | --driver=hyperkit                 |                                   |         |         |                     |          |
	|---------|-----------------------------------|-----------------------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/21 14:26:40
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.19.1 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0921 14:26:40.371630    4020 out.go:296] Setting OutFile to fd 1 ...
	I0921 14:26:40.371785    4020 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:26:40.371790    4020 out.go:309] Setting ErrFile to fd 2...
	I0921 14:26:40.371794    4020 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:26:40.371889    4020 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	W0921 14:26:40.371994    4020 root.go:310] Error reading config file at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/config/config.json: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/config/config.json: no such file or directory
	I0921 14:26:40.372331    4020 out.go:303] Setting JSON to true
	I0921 14:26:40.387254    4020 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1571,"bootTime":1663794029,"procs":342,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 14:26:40.387352    4020 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 14:26:40.408470    4020 out.go:97] [download-only-20220921142630-3535] minikube v1.27.0 on Darwin 12.6
	I0921 14:26:40.408537    4020 notify.go:214] Checking for updates...
	I0921 14:26:40.429285    4020 out.go:169] MINIKUBE_LOCATION=14995
	I0921 14:26:40.450542    4020 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 14:26:40.471521    4020 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 14:26:40.492415    4020 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 14:26:40.513473    4020 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	W0921 14:26:40.555311    4020 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0921 14:26:40.555739    4020 config.go:180] Loaded profile config "download-only-20220921142630-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	W0921 14:26:40.555783    4020 start.go:716] api.Load failed for download-only-20220921142630-3535: filestore "download-only-20220921142630-3535": Docker machine "download-only-20220921142630-3535" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0921 14:26:40.555827    4020 driver.go:365] Setting default libvirt URI to qemu:///system
	W0921 14:26:40.555849    4020 start.go:716] api.Load failed for download-only-20220921142630-3535: filestore "download-only-20220921142630-3535": Docker machine "download-only-20220921142630-3535" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0921 14:26:40.582448    4020 out.go:97] Using the hyperkit driver based on existing profile
	I0921 14:26:40.582464    4020 start.go:284] selected driver: hyperkit
	I0921 14:26:40.582469    4020 start.go:808] validating driver "hyperkit" against &{Name:download-only-20220921142630-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 Cl
usterName:download-only-20220921142630-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:fals
e DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 14:26:40.582594    4020 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 14:26:40.582699    4020 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0921 14:26:40.589014    4020 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.27.0
	I0921 14:26:40.592066    4020 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:26:40.592094    4020 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0921 14:26:40.593972    4020 cni.go:95] Creating CNI manager for ""
	I0921 14:26:40.593987    4020 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0921 14:26:40.594002    4020 start_flags.go:316] config:
	{Name:download-only-20220921142630-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 ClusterName:download-only-20220921142630-3535 Namespace:default APIServerName:miniku
beCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_v
mnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 14:26:40.594117    4020 iso.go:124] acquiring lock: {Name:mke8c57399926d29e846b47dd4be4625ba5fcaea Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0921 14:26:40.615452    4020 out.go:97] Starting control plane node download-only-20220921142630-3535 in cluster download-only-20220921142630-3535
	I0921 14:26:40.615465    4020 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 14:26:40.674687    4020 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.2/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4
	I0921 14:26:40.674710    4020 cache.go:57] Caching tarball of preloaded images
	I0921 14:26:40.674891    4020 preload.go:132] Checking if preload exists for k8s version v1.25.2 and runtime docker
	I0921 14:26:40.696314    4020 out.go:97] Downloading Kubernetes v1.25.2 preload ...
	I0921 14:26:40.696323    4020 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 ...
	I0921 14:26:40.777028    4020 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.2/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4?checksum=md5:b0e374b6adbebc5b5e0cfc12622b2408 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4
	I0921 14:26:45.354243    4020 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 ...
	I0921 14:26:45.354396    4020 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.2-docker-overlay2-amd64.tar.lz4 ...
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220921142630-3535"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.25.2/LogsDuration (0.33s)

TestDownloadOnly/DeleteAll (0.44s)

=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/DeleteAll (0.44s)

TestDownloadOnly/DeleteAlwaysSucceeds (0.42s)

=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:203: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-20220921142630-3535
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.42s)

TestBinaryMirror (1.05s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-20220921142648-3535 --alsologtostderr --binary-mirror http://127.0.0.1:49374 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-20220921142648-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-20220921142648-3535
--- PASS: TestBinaryMirror (1.05s)

TestOffline (61.43s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-20220921151637-3535 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-20220921151637-3535 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (57.969436735s)
helpers_test.go:175: Cleaning up "offline-docker-20220921151637-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-20220921151637-3535
E0921 15:17:36.647143    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory

=== CONT  TestOffline
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-20220921151637-3535: (3.465187735s)
--- PASS: TestOffline (61.43s)

TestAddons/Setup (144.85s)

=== RUN   TestAddons/Setup
addons_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-20220921142649-3535 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:76: (dbg) Done: out/minikube-darwin-amd64 start -p addons-20220921142649-3535 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m24.847018211s)
--- PASS: TestAddons/Setup (144.85s)

TestAddons/parallel/Registry (18.52s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:282: registry stabilized in 7.257615ms
addons_test.go:284: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-6zp9m" [4c94a8ff-4d3f-4538-8dac-5c18654b00df] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:284: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.005453621s
addons_test.go:287: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-proxy-qpc6v" [6e2a9424-348c-4b39-8866-748a14a27d04] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:287: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.006300225s
addons_test.go:292: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete po -l run=registry-test --now
addons_test.go:297: (dbg) Run:  kubectl --context addons-20220921142649-3535 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"

=== CONT  TestAddons/parallel/Registry
addons_test.go:297: (dbg) Done: kubectl --context addons-20220921142649-3535 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (7.931112214s)
addons_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 ip
2022/09/21 14:29:32 [DEBUG] GET http://192.168.64.2:5000
addons_test.go:340: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (18.52s)

TestAddons/parallel/Ingress (23.71s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:164: (dbg) Run:  kubectl --context addons-20220921142649-3535 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:184: (dbg) Run:  kubectl --context addons-20220921142649-3535 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:197: (dbg) Run:  kubectl --context addons-20220921142649-3535 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:202: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [60beff1d-17f4-4fd7-9a06-da50fe461475] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])

=== CONT  TestAddons/parallel/Ingress
helpers_test.go:342: "nginx" [60beff1d-17f4-4fd7-9a06-da50fe461475] Running

=== CONT  TestAddons/parallel/Ingress
addons_test.go:202: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 14.005012113s
addons_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:238: (dbg) Run:  kubectl --context addons-20220921142649-3535 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 ip

=== CONT  TestAddons/parallel/Ingress
addons_test.go:249: (dbg) Run:  nslookup hello-john.test 192.168.64.2
addons_test.go:258: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:263: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable ingress --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Ingress
addons_test.go:263: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable ingress --alsologtostderr -v=1: (7.381775101s)
--- PASS: TestAddons/parallel/Ingress (23.71s)

TestAddons/parallel/MetricsServer (5.43s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:359: metrics-server stabilized in 3.304913ms
addons_test.go:361: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:342: "metrics-server-769cd898cd-8nsp8" [0d066965-c447-4e9a-b13d-4b9cf477d366] Running
addons_test.go:361: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.008063412s
addons_test.go:367: (dbg) Run:  kubectl --context addons-20220921142649-3535 top pods -n kube-system
addons_test.go:384: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.43s)

TestAddons/parallel/HelmTiller (11.36s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:408: tiller-deploy stabilized in 1.43826ms
addons_test.go:410: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:342: "tiller-deploy-696b5bfbb7-6j9dl" [c79a2bc2-e0cd-4a03-a20c-d053caaa7e7a] Running
addons_test.go:410: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.010873125s
addons_test.go:425: (dbg) Run:  kubectl --context addons-20220921142649-3535 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:425: (dbg) Done: kubectl --context addons-20220921142649-3535 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version: (5.99015413s)
addons_test.go:442: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (11.36s)

TestAddons/parallel/CSI (47.21s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:513: csi-hostpath-driver pods stabilized in 5.554461ms
addons_test.go:516: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:521: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220921142649-3535 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220921142649-3535 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:526: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:531: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:342: "task-pv-pod" [06776254-0432-49ad-bd3b-8775713898d5] Pending
helpers_test.go:342: "task-pv-pod" [06776254-0432-49ad-bd3b-8775713898d5] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])

=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod" [06776254-0432-49ad-bd3b-8775713898d5] Running

=== CONT  TestAddons/parallel/CSI
addons_test.go:531: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 20.008304732s
addons_test.go:536: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:541: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220921142649-3535 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220921142649-3535 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:546: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete pod task-pv-pod
addons_test.go:552: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete pvc hpvc
addons_test.go:558: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:563: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220921142649-3535 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:568: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:573: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:342: "task-pv-pod-restore" [98a08339-f09d-42f4-aba0-cea9732974f8] Pending

=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod-restore" [98a08339-f09d-42f4-aba0-cea9732974f8] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])

=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod-restore" [98a08339-f09d-42f4-aba0-cea9732974f8] Running

=== CONT  TestAddons/parallel/CSI
addons_test.go:573: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 15.013702499s
addons_test.go:578: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete pod task-pv-pod-restore
addons_test.go:578: (dbg) Done: kubectl --context addons-20220921142649-3535 delete pod task-pv-pod-restore: (1.142403461s)
addons_test.go:582: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete pvc hpvc-restore
addons_test.go:586: (dbg) Run:  kubectl --context addons-20220921142649-3535 delete volumesnapshot new-snapshot-demo
addons_test.go:590: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:590: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.586107928s)
addons_test.go:594: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (47.21s)

TestAddons/parallel/Headlamp (11.14s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:737: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-20220921142649-3535 --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:737: (dbg) Done: out/minikube-darwin-amd64 addons enable headlamp -p addons-20220921142649-3535 --alsologtostderr -v=1: (1.128035958s)
addons_test.go:742: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:342: "headlamp-788c8d94dd-nfjgn" [54d69ba6-f8ae-444b-8946-5ba0879917ee] Pending
helpers_test.go:342: "headlamp-788c8d94dd-nfjgn" [54d69ba6-f8ae-444b-8946-5ba0879917ee] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])

=== CONT  TestAddons/parallel/Headlamp
helpers_test.go:342: "headlamp-788c8d94dd-nfjgn" [54d69ba6-f8ae-444b-8946-5ba0879917ee] Running

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:742: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.006493139s
--- PASS: TestAddons/parallel/Headlamp (11.14s)

TestAddons/serial/GCPAuth (19.37s)

=== RUN   TestAddons/serial/GCPAuth
addons_test.go:605: (dbg) Run:  kubectl --context addons-20220921142649-3535 create -f testdata/busybox.yaml
addons_test.go:612: (dbg) Run:  kubectl --context addons-20220921142649-3535 create sa gcp-auth-test
addons_test.go:618: (dbg) TestAddons/serial/GCPAuth: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [2c00d561-0ec4-4690-8c8f-8f197cc3674d] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [2c00d561-0ec4-4690-8c8f-8f197cc3674d] Running
addons_test.go:618: (dbg) TestAddons/serial/GCPAuth: integration-test=busybox healthy within 13.005605184s
addons_test.go:624: (dbg) Run:  kubectl --context addons-20220921142649-3535 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:636: (dbg) Run:  kubectl --context addons-20220921142649-3535 describe sa gcp-auth-test
addons_test.go:650: (dbg) Run:  kubectl --context addons-20220921142649-3535 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:674: (dbg) Run:  kubectl --context addons-20220921142649-3535 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
addons_test.go:687: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable gcp-auth --alsologtostderr -v=1
addons_test.go:687: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220921142649-3535 addons disable gcp-auth --alsologtostderr -v=1: (5.844939252s)
--- PASS: TestAddons/serial/GCPAuth (19.37s)

TestAddons/StoppedEnableDisable (3.56s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:134: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-20220921142649-3535
addons_test.go:134: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-20220921142649-3535: (3.224621473s)
addons_test.go:138: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-20220921142649-3535
addons_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-20220921142649-3535
--- PASS: TestAddons/StoppedEnableDisable (3.56s)

TestCertOptions (40.76s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-20220921151837-3535 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-20220921151837-3535 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (37.095619923s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-20220921151837-3535 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
E0921 15:19:14.486345    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-20220921151837-3535 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-20220921151837-3535 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-20220921151837-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-20220921151837-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-20220921151837-3535: (3.335370047s)
--- PASS: TestCertOptions (40.76s)

TestCertExpiration (251.37s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220921151821-3535 --memory=2048 --cert-expiration=3m --driver=hyperkit 

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220921151821-3535 --memory=2048 --cert-expiration=3m --driver=hyperkit : (39.45816849s)

=== CONT  TestCertExpiration
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220921151821-3535 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
E0921 15:22:03.108624    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220921151821-3535 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (26.641057819s)
helpers_test.go:175: Cleaning up "cert-expiration-20220921151821-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-20220921151821-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-20220921151821-3535: (5.271109515s)
--- PASS: TestCertExpiration (251.37s)

TestDockerFlags (58.39s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-20220921151738-3535 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-20220921151738-3535 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (52.723103351s)
docker_test.go:50: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220921151738-3535 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220921151738-3535 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-20220921151738-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-20220921151738-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-20220921151738-3535: (5.269229597s)
--- PASS: TestDockerFlags (58.39s)

TestForceSystemdFlag (44.69s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-20220921151737-3535 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-20220921151737-3535 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (41.079907333s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-20220921151737-3535 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-20220921151737-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-20220921151737-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-20220921151737-3535: (3.454965753s)
--- PASS: TestForceSystemdFlag (44.69s)

TestForceSystemdEnv (43.01s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-20220921151654-3535 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:149: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-20220921151654-3535 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (39.412262669s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-20220921151654-3535 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-20220921151654-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-20220921151654-3535

=== CONT  TestForceSystemdEnv
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-20220921151654-3535: (3.432333115s)
--- PASS: TestForceSystemdEnv (43.01s)

TestHyperKitDriverInstallOrUpdate (6.71s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (6.71s)

TestErrorSpam/setup (52.58s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-20220921143043-3535 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-20220921143043-3535 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 --driver=hyperkit : (52.580188112s)
--- PASS: TestErrorSpam/setup (52.58s)

TestErrorSpam/start (1.2s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 start --dry-run
--- PASS: TestErrorSpam/start (1.20s)

TestErrorSpam/status (0.45s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 status
--- PASS: TestErrorSpam/status (0.45s)

TestErrorSpam/pause (1.26s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 pause
--- PASS: TestErrorSpam/pause (1.26s)

TestErrorSpam/unpause (1.31s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 unpause
--- PASS: TestErrorSpam/unpause (1.31s)

TestErrorSpam/stop (3.64s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 stop: (3.225190003s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 stop
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220921143043-3535 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-20220921143043-3535 stop
--- PASS: TestErrorSpam/stop (3.64s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1781: local sync path: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/files/etc/test/nested/copy/3535/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (64.5s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2160: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2160: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (1m4.5000471s)
--- PASS: TestFunctional/serial/StartWithProxy (64.50s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (52.42s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:651: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --alsologtostderr -v=8
functional_test.go:651: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --alsologtostderr -v=8: (52.420447476s)
functional_test.go:655: soft start took 52.420903117s for "functional-20220921143144-3535" cluster.
--- PASS: TestFunctional/serial/SoftStart (52.42s)

TestFunctional/serial/KubeContext (0.03s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:673: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.03s)

TestFunctional/serial/KubectlGetPods (0.05s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:688: (dbg) Run:  kubectl --context functional-20220921143144-3535 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.05s)

TestFunctional/serial/CacheCmd/cache/add_remote (10.39s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:3.1
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:3.1: (3.47283298s)
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:3.3
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:3.3: (3.523344333s)
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:latest
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add k8s.gcr.io/pause:latest: (3.391404215s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (10.39s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1069: (dbg) Run:  docker build -t minikube-local-cache-test:functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialCacheCmdcacheadd_local736804028/001
functional_test.go:1081: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add minikube-local-cache-test:functional-20220921143144-3535
functional_test.go:1081: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache add minikube-local-cache-test:functional-20220921143144-3535: (1.079088644s)
functional_test.go:1086: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache delete minikube-local-cache-test:functional-20220921143144-3535
functional_test.go:1075: (dbg) Run:  docker rmi minikube-local-cache-test:functional-20220921143144-3535
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.59s)
=== RUN   TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3
functional_test.go:1094: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.07s)
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1102: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.07s)
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1116: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.16s)
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1139: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh sudo docker rmi k8s.gcr.io/pause:latest
functional_test.go:1145: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
functional_test.go:1145: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh sudo crictl inspecti k8s.gcr.io/pause:latest: exit status 1 (126.495116ms)
-- stdout --
	FATA[0000] no such image "k8s.gcr.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1150: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache reload
functional_test.go:1150: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 cache reload: (1.86373638s)
functional_test.go:1155: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (2.30s)
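The cache_reload steps above can be reproduced by hand. A minimal sketch, assuming a `minikube` binary on PATH and an existing profile named `demo` (both are assumptions; the report itself drives `out/minikube-darwin-amd64` against a `functional-*` profile):

```shell
# Reproduce the cache-reload cycle; skips cleanly when minikube is not installed.
MK="$(command -v minikube || true)"
if [ -z "$MK" ]; then
  STATUS=skipped                                                  # no minikube on this machine
else
  "$MK" -p demo cache add k8s.gcr.io/pause:latest                 # pull the image into the host-side cache
  "$MK" -p demo ssh sudo docker rmi k8s.gcr.io/pause:latest       # delete it inside the node
  "$MK" -p demo cache reload                                      # push cached images back into the node
  "$MK" -p demo ssh sudo crictl inspecti k8s.gcr.io/pause:latest  # should now succeed again
  STATUS=ok
fi
```

The intermediate `crictl inspecti` failure in the log (exit status 1, `no such image ... present`) is the expected state between the `rmi` and the `cache reload`.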
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1164: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.1
functional_test.go:1164: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.15s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:708: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 kubectl -- --context functional-20220921143144-3535 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.49s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:733: (dbg) Run:  out/kubectl --context functional-20220921143144-3535 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.64s)
=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:749: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0921 14:34:14.450424    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.458210    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.469464    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.489865    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.530543    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.612827    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:14.775084    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:15.097281    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:15.751380    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:17.031603    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:19.593936    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:24.716271    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:34:34.956545    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
functional_test.go:749: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (43.84494167s)
functional_test.go:753: restart took 43.84505072s for "functional-20220921143144-3535" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (43.85s)
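The restart above exercises `--extra-config`, which takes `component.flag=value` pairs that are forwarded to the named Kubernetes component. A hedged sketch with the same apiserver flag; the profile name `demo` is illustrative, not from the report:

```shell
# Restart a profile with an extra apiserver flag; skips when minikube is absent.
MK="$(command -v minikube || true)"
if [ -z "$MK" ]; then
  STATUS=skipped
else
  "$MK" start -p demo \
    --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision \
    --wait=all            # block until all verified components report ready
  STATUS=ok
fi
```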
=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:802: (dbg) Run:  kubectl --context functional-20220921143144-3535 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:817: etcd phase: Running
functional_test.go:827: etcd status: Ready
functional_test.go:817: kube-apiserver phase: Running
functional_test.go:827: kube-apiserver status: Ready
functional_test.go:817: kube-controller-manager phase: Running
functional_test.go:827: kube-controller-manager status: Ready
functional_test.go:817: kube-scheduler phase: Running
functional_test.go:827: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)
=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1228: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 logs
functional_test.go:1228: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 logs: (2.58961191s)
--- PASS: TestFunctional/serial/LogsCmd (2.59s)
=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1242: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd2126375748/001/logs.txt
functional_test.go:1242: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd2126375748/001/logs.txt: (2.684174584s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.69s)
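`logs --file` writes the same output that `logs` prints to stdout into a file instead. A small sketch, assuming a `minikube` binary and a profile named `demo` (the report used a per-test temp directory for the destination path):

```shell
# Capture cluster logs to a file rather than stdout; skips without minikube.
MK="$(command -v minikube || true)"
LOGFILE="$(mktemp)"                       # illustrative destination
if [ -z "$MK" ]; then
  STATUS=skipped
else
  "$MK" -p demo logs --file "$LOGFILE"    # nothing is printed; logs land in the file
  wc -l "$LOGFILE"                        # quick sanity check that it is non-trivial
  STATUS=ok
fi
```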
=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config unset cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config get cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 config get cpus: exit status 14 (57.319501ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config set cpus 2
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config get cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config unset cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 config get cpus
functional_test.go:1191: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 config get cpus: exit status 14 (53.753419ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.50s)
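The set/get/unset round trip above can be sketched as follows, assuming a `minikube` binary on PATH (an assumption; the test uses the freshly built `out/minikube-darwin-amd64`):

```shell
# Round-trip a persistent minikube config key; skips without minikube.
MK="$(command -v minikube || true)"
if [ -z "$MK" ]; then
  STATUS=skipped
else
  "$MK" config set cpus 2        # persist a user-level default
  "$MK" config get cpus          # prints the stored value
  "$MK" config unset cpus        # remove it again
  "$MK" config get cpus; RC=$?   # a missing key exits non-zero (14 in this report)
  STATUS=ok
fi
```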
=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:897: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220921143144-3535 --alsologtostderr -v=1]
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:902: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220921143144-3535 --alsologtostderr -v=1] ...
helpers_test.go:506: unable to kill pid 5369: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (13.06s)
=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:966: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:966: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (431.617324ms)
-- stdout --
	* [functional-20220921143144-3535] minikube v1.27.0 on Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	* Using the hyperkit driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I0921 14:35:44.483750    5339 out.go:296] Setting OutFile to fd 1 ...
	I0921 14:35:44.483939    5339 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:35:44.483944    5339 out.go:309] Setting ErrFile to fd 2...
	I0921 14:35:44.483947    5339 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:35:44.484049    5339 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 14:35:44.484478    5339 out.go:303] Setting JSON to false
	I0921 14:35:44.500371    5339 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2115,"bootTime":1663794029,"procs":377,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 14:35:44.500507    5339 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 14:35:44.522230    5339 out.go:177] * [functional-20220921143144-3535] minikube v1.27.0 on Darwin 12.6
	I0921 14:35:44.564043    5339 out.go:177]   - MINIKUBE_LOCATION=14995
	I0921 14:35:44.584828    5339 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 14:35:44.606266    5339 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 14:35:44.628225    5339 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 14:35:44.650130    5339 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	I0921 14:35:44.672723    5339 config.go:180] Loaded profile config "functional-20220921143144-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 14:35:44.673377    5339 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:35:44.673453    5339 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:35:44.680498    5339 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:50289
	I0921 14:35:44.680915    5339 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:35:44.681334    5339 main.go:134] libmachine: Using API Version  1
	I0921 14:35:44.681345    5339 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:35:44.681542    5339 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:35:44.681653    5339 main.go:134] libmachine: (functional-20220921143144-3535) Calling .DriverName
	I0921 14:35:44.681765    5339 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 14:35:44.682038    5339 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:35:44.682059    5339 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:35:44.687946    5339 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:50291
	I0921 14:35:44.688267    5339 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:35:44.688549    5339 main.go:134] libmachine: Using API Version  1
	I0921 14:35:44.688559    5339 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:35:44.688742    5339 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:35:44.688862    5339 main.go:134] libmachine: (functional-20220921143144-3535) Calling .DriverName
	I0921 14:35:44.715782    5339 out.go:177] * Using the hyperkit driver based on existing profile
	I0921 14:35:44.757768    5339 start.go:284] selected driver: hyperkit
	I0921 14:35:44.757785    5339 start.go:808] validating driver "hyperkit" against &{Name:functional-20220921143144-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 Clust
erName:functional-20220921143144-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.4 Port:8441 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gp
u-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 14:35:44.757943    5339 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0921 14:35:44.780825    5339 out.go:177] 
	W0921 14:35:44.801916    5339 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0921 14:35:44.822964    5339 out.go:177] 
** /stderr **
functional_test.go:983: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.14s)
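The dry run above validates flags against the saved profile without touching the VM, which is how the test checks the memory floor cheaply. A sketch under the same assumptions (minikube on PATH, illustrative profile `demo`):

```shell
# Trigger the memory validation via --dry-run; skips without minikube.
MK="$(command -v minikube || true)"
if [ -z "$MK" ]; then
  STATUS=skipped
else
  "$MK" start -p demo --dry-run --memory 250MB --driver=hyperkit; RC=$?
  # In this report the same request exited with status 23:
  # RSRC_INSUFFICIENT_REQ_MEMORY, 250MiB being below the 1800MB usable minimum.
  STATUS=ok
fi
```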
=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1012: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1012: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220921143144-3535 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (476.736485ms)
-- stdout --
	* [functional-20220921143144-3535] minikube v1.27.0 sur Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	
-- /stdout --
** stderr ** 
	I0921 14:35:36.748821    5263 out.go:296] Setting OutFile to fd 1 ...
	I0921 14:35:36.748958    5263 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:35:36.748963    5263 out.go:309] Setting ErrFile to fd 2...
	I0921 14:35:36.748967    5263 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:35:36.749079    5263 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 14:35:36.749495    5263 out.go:303] Setting JSON to false
	I0921 14:35:36.764830    5263 start.go:115] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2107,"bootTime":1663794029,"procs":340,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.6","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0921 14:35:36.764927    5263 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0921 14:35:36.787175    5263 out.go:177] * [functional-20220921143144-3535] minikube v1.27.0 sur Darwin 12.6
	I0921 14:35:36.828704    5263 out.go:177]   - MINIKUBE_LOCATION=14995
	I0921 14:35:36.849875    5263 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	I0921 14:35:36.870865    5263 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0921 14:35:36.891765    5263 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0921 14:35:36.913157    5263 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	I0921 14:35:36.935711    5263 config.go:180] Loaded profile config "functional-20220921143144-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 14:35:36.936366    5263 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:35:36.936452    5263 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:35:36.943277    5263 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:50182
	I0921 14:35:36.943665    5263 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:35:36.944089    5263 main.go:134] libmachine: Using API Version  1
	I0921 14:35:36.944100    5263 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:35:36.944305    5263 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:35:36.944395    5263 main.go:134] libmachine: (functional-20220921143144-3535) Calling .DriverName
	I0921 14:35:36.944511    5263 driver.go:365] Setting default libvirt URI to qemu:///system
	I0921 14:35:36.944759    5263 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:35:36.944779    5263 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:35:36.951082    5263 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:50184
	I0921 14:35:36.951419    5263 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:35:36.951762    5263 main.go:134] libmachine: Using API Version  1
	I0921 14:35:36.951773    5263 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:35:36.951980    5263 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:35:36.952071    5263 main.go:134] libmachine: (functional-20220921143144-3535) Calling .DriverName
	I0921 14:35:36.978618    5263 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0921 14:35:37.021756    5263 start.go:284] selected driver: hyperkit
	I0921 14:35:37.021772    5263 start.go:808] validating driver "hyperkit" against &{Name:functional-20220921143144-3535 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.27.0-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.34@sha256:f2a1e577e43fd6769f35cdb938f6d21c3dacfd763062d119cade738fa244720c Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.2 Clust
erName:functional-20220921143144-3535 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.4 Port:8441 KubernetesVersion:v1.25.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gp
u-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath:/opt/socket_vmnet/bin/socket_vmnet_client SocketVMnetPath:/var/run/socket_vmnet}
	I0921 14:35:37.021873    5263 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0921 14:35:37.044659    5263 out.go:177] 
	W0921 14:35:37.066199    5263 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0921 14:35:37.104031    5263 out.go:177] 
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.48s)
TestFunctional/parallel/StatusCmd (0.44s)
=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:846: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 status
functional_test.go:852: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:864: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.44s)
TestFunctional/parallel/ServiceCmd (10.18s)
=== RUN   TestFunctional/parallel/ServiceCmd
=== PAUSE TestFunctional/parallel/ServiceCmd
=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1432: (dbg) Run:  kubectl --context functional-20220921143144-3535 create deployment hello-node --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1438: (dbg) Run:  kubectl --context functional-20220921143144-3535 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1443: (dbg) TestFunctional/parallel/ServiceCmd: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:342: "hello-node-5fcdfb5cc4-8z7zm" [f362d51e-9f06-470d-9080-6fdbf9e8a964] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:342: "hello-node-5fcdfb5cc4-8z7zm" [f362d51e-9f06-470d-9080-6fdbf9e8a964] Running
E0921 14:35:36.400981    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1443: (dbg) TestFunctional/parallel/ServiceCmd: app=hello-node healthy within 8.007123515s
functional_test.go:1448: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 service list
functional_test.go:1462: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 service --namespace=default --https --url hello-node
functional_test.go:1475: found endpoint: https://192.168.64.4:32596
functional_test.go:1490: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 service hello-node --url --format={{.IP}}
functional_test.go:1504: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 service hello-node --url
functional_test.go:1510: found endpoint for hello-node: http://192.168.64.4:32596
--- PASS: TestFunctional/parallel/ServiceCmd (10.18s)
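The URLs found above (https://192.168.64.4:32596) are simply the node's IP joined with the NodePort allocated to the service. A minimal sketch of that composition, using the literal values from this run so it is self-contained; the commented-out kubectl one-liners are the usual way to query them live, and the jsonpath expressions are illustrative:

```shell
# Sketch: how a NodePort service URL like the one in the log is composed.
NODE_IP="192.168.64.4"    # kubectl get nodes -o jsonpath='{.items[0].status.addresses[?(@.type=="InternalIP")].address}'
NODE_PORT="32596"         # kubectl get svc hello-node -o jsonpath='{.spec.ports[0].nodePort}'
URL="http://${NODE_IP}:${NODE_PORT}"
echo "$URL"
```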
TestFunctional/parallel/ServiceCmdConnect (13.34s)
=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1558: (dbg) Run:  kubectl --context functional-20220921143144-3535 create deployment hello-node-connect --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1564: (dbg) Run:  kubectl --context functional-20220921143144-3535 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1569: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:342: "hello-node-connect-6458c8fb6f-prvhx" [330b071e-2969-413b-aee6-e206324e7973] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
=== CONT  TestFunctional/parallel/ServiceCmdConnect
helpers_test.go:342: "hello-node-connect-6458c8fb6f-prvhx" [330b071e-2969-413b-aee6-e206324e7973] Running
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1569: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 13.008349459s
functional_test.go:1578: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 service hello-node-connect --url
functional_test.go:1584: found endpoint for hello-node-connect: http://192.168.64.4:30179
functional_test.go:1604: http://192.168.64.4:30179: success! body:
Hostname: hello-node-connect-6458c8fb6f-prvhx
Pod Information:
	-no pod information available-
Server values:
	server_version=nginx: 1.13.3 - lua: 10008
Request Information:
	client_address=172.17.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.64.4:8080/
Request Headers:
	accept-encoding=gzip
	host=192.168.64.4:30179
	user-agent=Go-http-client/1.1
Request Body:
	-no body in request-
--- PASS: TestFunctional/parallel/ServiceCmdConnect (13.34s)
TestFunctional/parallel/AddonsCmd (0.26s)
=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1619: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 addons list
functional_test.go:1631: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.26s)
TestFunctional/parallel/PersistentVolumeClaim (27.03s)
=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:342: "storage-provisioner" [1586d963-17a2-4e59-aaf2-e2e7fb6b5c30] Running
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.007491164s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-20220921143144-3535 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-20220921143144-3535 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-20220921143144-3535 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220921143144-3535 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [344f7613-fc92-43bd-844e-32c0ae480e45] Pending
helpers_test.go:342: "sp-pod" [344f7613-fc92-43bd-844e-32c0ae480e45] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [344f7613-fc92-43bd-844e-32c0ae480e45] Running
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 13.016529865s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-20220921143144-3535 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-20220921143144-3535 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220921143144-3535 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [19f22f03-4571-4ff6-926f-4e6e6a3f584d] Pending
helpers_test.go:342: "sp-pod" [19f22f03-4571-4ff6-926f-4e6e6a3f584d] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:342: "sp-pod" [19f22f03-4571-4ff6-926f-4e6e6a3f584d] Running
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.012823785s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-20220921143144-3535 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (27.03s)
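The sequence above verifies that data on the claim outlives any single pod: `touch /tmp/mount/foo` in the first `sp-pod`, delete the pod, recreate it from the same manifest, and `ls /tmp/mount` still shows the file. A cluster-free sketch of that property, where a temp directory is a stand-in for the dynamically provisioned volume:

```shell
# Stand-in sketch of the persistence check: the temp directory plays the role
# of the PVC-backed volume that survives pod deletion and recreation.
vol=$(mktemp -d)          # "provisioned volume"
touch "$vol/foo"          # pod 1: kubectl exec sp-pod -- touch /tmp/mount/foo
#                           pod 1 is deleted; pod 2 mounts the same claim...
listing=$(ls "$vol")      # pod 2: kubectl exec sp-pod -- ls /tmp/mount
echo "$listing"
rm -rf "$vol"
```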
TestFunctional/parallel/SSHCmd (0.27s)
=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1654: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "echo hello"
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1671: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.27s)
TestFunctional/parallel/CpCmd (0.58s)
=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh -n functional-20220921143144-3535 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 cp functional-20220921143144-3535:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelCpCmd1007906847/001/cp-test.txt
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh -n functional-20220921143144-3535 "sudo cat /home/docker/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.58s)
TestFunctional/parallel/MySQL (19.95s)
=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1719: (dbg) Run:  kubectl --context functional-20220921143144-3535 replace --force -f testdata/mysql.yaml
functional_test.go:1725: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:342: "mysql-596b7fcdbf-fscd8" [e485a8ca-375a-4577-93af-4c3a04618e39] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
=== CONT  TestFunctional/parallel/MySQL
helpers_test.go:342: "mysql-596b7fcdbf-fscd8" [e485a8ca-375a-4577-93af-4c3a04618e39] Running
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1725: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 18.012567539s
functional_test.go:1733: (dbg) Run:  kubectl --context functional-20220921143144-3535 exec mysql-596b7fcdbf-fscd8 -- mysql -ppassword -e "show databases;"
functional_test.go:1733: (dbg) Non-zero exit: kubectl --context functional-20220921143144-3535 exec mysql-596b7fcdbf-fscd8 -- mysql -ppassword -e "show databases;": exit status 1 (181.778713ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1733: (dbg) Run:  kubectl --context functional-20220921143144-3535 exec mysql-596b7fcdbf-fscd8 -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (19.95s)
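The first `exec ... "show databases;"` above fails with `ERROR 2002` because mysqld inside the pod is still starting up when the pod is already Running; the test simply retries the probe until it succeeds. A self-contained sketch of that retry loop, where `probe` is a stand-in for the real `kubectl exec ... mysql` call and is wired to succeed on its third attempt:

```shell
# Retry-until-ready sketch. `probe` stands in for the kubectl exec probe and
# succeeds on the 3rd call, mimicking a server that finishes startup.
attempts=0
probe() {
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ]
}
until probe; do
  : # the real test backs off between kubectl exec attempts
done
echo "ready after $attempts attempts"
```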
TestFunctional/parallel/FileSync (0.14s)
=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1855: Checking for existence of /etc/test/nested/copy/3535/hosts within VM
functional_test.go:1857: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /etc/test/nested/copy/3535/hosts"
functional_test.go:1862: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.14s)
TestFunctional/parallel/CertSync (0.83s)
=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1898: Checking for existence of /etc/ssl/certs/3535.pem within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /etc/ssl/certs/3535.pem"
functional_test.go:1898: Checking for existence of /usr/share/ca-certificates/3535.pem within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /usr/share/ca-certificates/3535.pem"
functional_test.go:1898: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1925: Checking for existence of /etc/ssl/certs/35352.pem within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /etc/ssl/certs/35352.pem"
functional_test.go:1925: Checking for existence of /usr/share/ca-certificates/35352.pem within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /usr/share/ca-certificates/35352.pem"
functional_test.go:1925: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.83s)
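The `51391683.0` and `3ec20f2e.0` names checked above follow the OpenSSL trust-store convention: each CA certificate under `/etc/ssl/certs` gets a link named `<subject hash>.0`. A sketch deriving such a name from a throwaway self-signed certificate (assumes the `openssl` CLI is installed; the CN is arbitrary):

```shell
# Sketch of the <subject_hash>.0 naming used under /etc/ssl/certs.
# The certificate generated here is throwaway demo material.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.crt" -days 1 2>/dev/null
hash=$(openssl x509 -in "$tmp/ca.crt" -noout -subject_hash)
echo "trust-store link name: ${hash}.0"
rm -rf "$tmp"
```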
TestFunctional/parallel/NodeLabels (0.05s)
=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:214: (dbg) Run:  kubectl --context functional-20220921143144-3535 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.05s)
TestFunctional/parallel/NonActiveRuntimeDisabled (0.17s)
=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1953: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo systemctl is-active crio"
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1953: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo systemctl is-active crio": exit status 1 (170.819385ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.17s)
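The non-zero exit above is the expected outcome: `systemctl is-active` prints the unit state on stdout and exits with status 3 for an inactive unit, which is exactly what this test asserts for crio on a docker-runtime cluster. A stand-in sketch of that contract (`is_active` is a hypothetical function emulating systemctl; no systemd required):

```shell
# Stand-in for `systemctl is-active <unit>` on an inactive unit:
# prints the state on stdout and exits with status 3.
is_active() {
  echo "inactive"
  return 3
}
out=$(is_active)
rc=$?
echo "stdout=$out rc=$rc"
```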
TestFunctional/parallel/Version/short (0.09s)
=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2182: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 version --short
--- PASS: TestFunctional/parallel/Version/short (0.09s)
TestFunctional/parallel/Version/components (0.35s)
=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2196: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.35s)
TestFunctional/parallel/ImageCommands/ImageListShort (0.16s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format short
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format short:
registry.k8s.io/pause:3.8
registry.k8s.io/kube-scheduler:v1.25.2
registry.k8s.io/kube-proxy:v1.25.2
registry.k8s.io/kube-controller-manager:v1.25.2
registry.k8s.io/kube-apiserver:v1.25.2
registry.k8s.io/etcd:3.5.4-0
registry.k8s.io/coredns/coredns:v1.9.3
k8s.gcr.io/pause:latest
k8s.gcr.io/pause:3.6
k8s.gcr.io/pause:3.3
k8s.gcr.io/pause:3.1
k8s.gcr.io/echoserver:1.8
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-20220921143144-3535
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.16s)
TestFunctional/parallel/ImageCommands/ImageListTable (0.15s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format table
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format table:
|---------------------------------------------|--------------------------------|---------------|--------|
|                    Image                    |              Tag               |   Image ID    |  Size  |
|---------------------------------------------|--------------------------------|---------------|--------|
| docker.io/localhost/my-image                | functional-20220921143144-3535 | 381c15f316a9c | 1.24MB |
| registry.k8s.io/kube-scheduler              | v1.25.2                        | ca0ea1ee3cfd3 | 50.6MB |
| registry.k8s.io/kube-proxy                  | v1.25.2                        | 1c7d8c51823b5 | 61.7MB |
| docker.io/library/mysql                     | 5.7                            | daff57b7d2d1e | 430MB  |
| registry.k8s.io/etcd                        | 3.5.4-0                        | a8a176a5d5d69 | 300MB  |
| k8s.gcr.io/pause                            | 3.3                            | 0184c1613d929 | 683kB  |
| registry.k8s.io/pause                       | 3.8                            | 4873874c08efc | 711kB  |
| docker.io/kubernetesui/dashboard            | <none>                         | 1042d9e0d8fcc | 246MB  |
| registry.k8s.io/coredns/coredns             | v1.9.3                         | 5185b96f0becf | 48.8MB |
| gcr.io/k8s-minikube/busybox                 | latest                         | beae173ccac6a | 1.24MB |
| k8s.gcr.io/pause                            | 3.6                            | 6270bb605e12e | 683kB  |
| k8s.gcr.io/echoserver                       | 1.8                            | 82e4c8a736a4f | 95.4MB |
| docker.io/library/nginx                     | latest                         | 2d389e545974d | 142MB  |
| gcr.io/google-containers/addon-resizer      | functional-20220921143144-3535 | ffd4cfbbe753e | 32.9MB |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc                   | 56cc512116c8f | 4.4MB  |
| k8s.gcr.io/pause                            | latest                         | 350b164e7ae1d | 240kB  |
| docker.io/library/minikube-local-cache-test | functional-20220921143144-3535 | 396de3ea5652c | 30B    |
| registry.k8s.io/kube-apiserver              | v1.25.2                        | 97801f8394908 | 128MB  |
| registry.k8s.io/kube-controller-manager     | v1.25.2                        | dbfceb93c69b6 | 117MB  |
| docker.io/library/nginx                     | alpine                         | 804f9cebfdc58 | 23.5MB |
| docker.io/kubernetesui/metrics-scraper      | <none>                         | 115053965e86b | 43.8MB |
| gcr.io/k8s-minikube/storage-provisioner     | v5                             | 6e38f40d628db | 31.5MB |
| k8s.gcr.io/pause                            | 3.1                            | da86e6ba6ca19 | 742kB  |
|---------------------------------------------|--------------------------------|---------------|--------|
2022/09/21 14:35:58 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.15s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.19s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format json
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format json:
[{"id":"beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:latest"],"size":"1240000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-20220921143144-3535"],"size":"32900000"},{"id":"ca0ea1ee3cfd3d1ced15a8e6f4a236a436c5733b20a0b2dbbfbfd59977e12959","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.25.2"],"size":"50600000"},{"id":"daff57b7d2d1e009d0b271972f62dbf4de64b8cdb9cd646442aeda961e615f44","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"430000000"},{"id":"804f9cebfdc58964d6b25527e53802a3527a9ee880e082dc5b19a3d5466c43b7","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"23500000"},{"id":"6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.6"],"size":"683000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.3"],"size":"683000"},{"id":"dbfceb93c69b6d85661fe46c3e50de9e927e4895ebba2892a1db116e69c81890","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.25.2"],"size":"117000000"},{"id":"a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.4-0"],"size":"300000000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.8"],"size":"711000"},{"id":"1042d9e0d8fcc64f2c6b9ade3af9e8ed255fa04d18d838d0b3650ad7636534a9","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.9.3"],"size":"48800000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.1"],"size":"742000"},{"id":"381c15f316a9c20793ff67d86a3a86164f38ef544263b307df1062cbbe44c29d","repoDigests":[],"repoTags":["docker.io/localhost/my-image:functional-20220921143144-3535"],"size":"1240000"},{"id":"396de3ea5652cfe895d2486124bde3f37ca5caa610f35fe76f92f7a83190c2d0","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-20220921143144-3535"],"size":"30"},{"id":"1c7d8c51823b5eb08189d553d911097ec8a6a40fea40bb5bdea91842f30d2e86","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.25.2"],"size":"61700000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["k8s.gcr.io/pause:latest"],"size":"240000"},{"id":"97801f83949087fbdcc09b1c84ddda0ed5d01f4aabd17787a7714eb2796082b3","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.25.2"],"size":"128000000"},{"id":"2d389e545974d4a93ebdef09b650753a55f72d1ab4518d17a30c0e1b3e297444","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"142000000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["k8s.gcr.io/echoserver:1.8"],"size":"95400000"}]
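Since `image ls --format json` emits a single JSON array, the output above can be post-processed directly. A minimal sketch, using a trimmed two-entry sample modeled on the entries shown (the `sample` literal is illustrative, not the full list):

```python
import json

# Trimmed, illustrative sample shaped like the `image ls --format json` output.
sample = """[
  {"id": "beae173ccac6", "repoDigests": [], "repoTags": ["gcr.io/k8s-minikube/busybox:latest"], "size": "1240000"},
  {"id": "a8a176a5d5d6", "repoDigests": [], "repoTags": ["registry.k8s.io/etcd:3.5.4-0"], "size": "300000000"}
]"""

images = json.loads(sample)
# Sizes are reported as strings of bytes, so convert before comparing numerically.
largest = max(images, key=lambda img: int(img["size"]))
print(largest["repoTags"][0])  # registry.k8s.io/etcd:3.5.4-0
```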
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.19s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format yaml
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls --format yaml:
- id: 2d389e545974d4a93ebdef09b650753a55f72d1ab4518d17a30c0e1b3e297444
  repoDigests: []
  repoTags:
  - docker.io/library/nginx:latest
  size: "142000000"
- id: 804f9cebfdc58964d6b25527e53802a3527a9ee880e082dc5b19a3d5466c43b7
  repoDigests: []
  repoTags:
  - docker.io/library/nginx:alpine
  size: "23500000"
- id: 4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517
  repoDigests: []
  repoTags:
  - registry.k8s.io/pause:3.8
  size: "711000"
- id: a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66
  repoDigests: []
  repoTags:
  - registry.k8s.io/etcd:3.5.4-0
  size: "300000000"
- id: 396de3ea5652cfe895d2486124bde3f37ca5caa610f35fe76f92f7a83190c2d0
  repoDigests: []
  repoTags:
  - docker.io/library/minikube-local-cache-test:functional-20220921143144-3535
  size: "30"
- id: dbfceb93c69b6d85661fe46c3e50de9e927e4895ebba2892a1db116e69c81890
  repoDigests: []
  repoTags:
  - registry.k8s.io/kube-controller-manager:v1.25.2
  size: "117000000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
  repoDigests: []
  repoTags:
  - gcr.io/k8s-minikube/busybox:1.28.4-glibc
  size: "4400000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
  repoDigests: []
  repoTags:
  - gcr.io/k8s-minikube/storage-provisioner:v5
  size: "31500000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
  repoDigests: []
  repoTags:
  - gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
  size: "32900000"
- id: 5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a
  repoDigests: []
  repoTags:
  - registry.k8s.io/coredns/coredns:v1.9.3
  size: "48800000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
  repoDigests: []
  repoTags:
  - k8s.gcr.io/pause:3.3
  size: "683000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
  repoDigests: []
  repoTags:
  - k8s.gcr.io/pause:3.1
  size: "742000"
- id: 97801f83949087fbdcc09b1c84ddda0ed5d01f4aabd17787a7714eb2796082b3
  repoDigests: []
  repoTags:
  - registry.k8s.io/kube-apiserver:v1.25.2
  size: "128000000"
- id: ca0ea1ee3cfd3d1ced15a8e6f4a236a436c5733b20a0b2dbbfbfd59977e12959
  repoDigests: []
  repoTags:
  - registry.k8s.io/kube-scheduler:v1.25.2
  size: "50600000"
- id: 6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee
  repoDigests: []
  repoTags:
  - k8s.gcr.io/pause:3.6
  size: "683000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
  repoDigests: []
  repoTags:
  - k8s.gcr.io/echoserver:1.8
  size: "95400000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
  repoDigests: []
  repoTags:
  - k8s.gcr.io/pause:latest
  size: "240000"
- id: 1c7d8c51823b5eb08189d553d911097ec8a6a40fea40bb5bdea91842f30d2e86
  repoDigests: []
  repoTags:
  - registry.k8s.io/kube-proxy:v1.25.2
  size: "61700000"
- id: daff57b7d2d1e009d0b271972f62dbf4de64b8cdb9cd646442aeda961e615f44
  repoDigests: []
  repoTags:
  - docker.io/library/mysql:5.7
  size: "430000000"

--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.16s)

TestFunctional/parallel/ImageCommands/ImageBuild (5.71s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:303: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh pgrep buildkitd
functional_test.go:303: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh pgrep buildkitd: exit status 1 (115.67998ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image build -t localhost/my-image:functional-20220921143144-3535 testdata/build
functional_test.go:310: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image build -t localhost/my-image:functional-20220921143144-3535 testdata/build: (5.447319574s)
functional_test.go:315: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image build -t localhost/my-image:functional-20220921143144-3535 testdata/build:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 21526819afc0
Removing intermediate container 21526819afc0
---> 38e403e3c830
Step 3/3 : ADD content.txt /
---> 381c15f316a9
Successfully built 381c15f316a9
Successfully tagged localhost/my-image:functional-20220921143144-3535
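The three build steps logged above imply that the `testdata/build` context contains a Dockerfile along these lines (reconstructed from the step output; the actual file may differ):

```dockerfile
FROM gcr.io/k8s-minikube/busybox
RUN true
ADD content.txt /
```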
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (5.71s)

TestFunctional/parallel/ImageCommands/Setup (4.02s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
=== CONT  TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (3.952427599s)
functional_test.go:342: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
--- PASS: TestFunctional/parallel/ImageCommands/Setup (4.02s)

TestFunctional/parallel/DockerEnv/bash (0.63s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:491: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220921143144-3535 docker-env) && out/minikube-darwin-amd64 status -p functional-20220921143144-3535"
functional_test.go:514: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220921143144-3535 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.63s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.18s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.7s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:350: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
functional_test.go:350: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535: (3.540789525s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.70s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.04s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:360: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
E0921 14:34:55.438514    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
functional_test.go:360: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535: (1.876409387s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.04s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (6.65s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:230: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:230: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (3.980232702s)
functional_test.go:235: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
functional_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
=== CONT  TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:240: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535: (2.459019089s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (6.65s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.83s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:375: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image save gcr.io/google-containers/addon-resizer:functional-20220921143144-3535 /Users/jenkins/workspace/addon-resizer-save.tar
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.83s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.32s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:387: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image rm gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.32s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.44s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:404: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load /Users/jenkins/workspace/addon-resizer-save.tar
functional_test.go:404: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image load /Users/jenkins/workspace/addon-resizer-save.tar: (1.285130163s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.44s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.26s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:414: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
functional_test.go:419: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
=== CONT  TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:419: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220921143144-3535 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220921143144-3535: (2.134090743s)
functional_test.go:424: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.26s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.41s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1265: (dbg) Run:  out/minikube-darwin-amd64 profile lis
=== CONT  TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1270: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.41s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:127: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-20220921143144-3535 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.12s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:147: (dbg) Run:  kubectl --context functional-20220921143144-3535 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:342: "nginx-svc" [ce020ffe-249f-486c-872e-ed37e4de93e9] Pending
=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
helpers_test.go:342: "nginx-svc" [ce020ffe-249f-486c-872e-ed37e4de93e9] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx-svc" [ce020ffe-249f-486c-872e-ed37e4de93e9] Running
=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 11.005094079s
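The "waiting 4m0s for pods matching ..." check above is a poll-until-ready loop with a deadline. A generic sketch of that pattern (the predicate below is a stand-in for the real pod-phase lookup, not minikube code):

```python
import time

def wait_for(predicate, timeout=240.0, interval=0.5):
    """Poll predicate() until it returns True or the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False

# Stand-in for "pod with label run=nginx-svc reports phase Running".
print(wait_for(lambda: True, timeout=5))  # True: ready on the first poll
```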
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.12s)

TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1305: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1310: Took "204.508453ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1319: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1324: Took "85.478188ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.37s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1356: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1361: Took "255.549464ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1369: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1374: Took "115.59868ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.37s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.04s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:169: (dbg) Run:  kubectl --context functional-20220921143144-3535 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.04s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:234: tunnel at http://10.99.222.19 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:254: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:262: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:286: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:294: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:359: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.05s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:369: (dbg) stopping [out/minikube-darwin-amd64 -p functional-20220921143144-3535 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/MountCmd/any-port (9.85s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:66: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port1595533174/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:100: wrote "test-1663796137184907000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port1595533174/001/created-by-test
functional_test_mount_test.go:100: wrote "test-1663796137184907000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port1595533174/001/created-by-test-removed-by-pod
functional_test_mount_test.go:100: wrote "test-1663796137184907000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port1595533174/001/test-1663796137184907000
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:108: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (113.702532ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh -- ls -la /mount-9p
functional_test_mount_test.go:126: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Sep 21 21:35 created-by-test
-rw-r--r-- 1 docker docker 24 Sep 21 21:35 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Sep 21 21:35 test-1663796137184907000
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh cat /mount-9p/test-1663796137184907000
functional_test_mount_test.go:141: (dbg) Run:  kubectl --context functional-20220921143144-3535 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:342: "busybox-mount" [c6f780aa-7129-4543-8ec5-5ece71f36fab] Pending
helpers_test.go:342: "busybox-mount" [c6f780aa-7129-4543-8ec5-5ece71f36fab] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])

=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [c6f780aa-7129-4543-8ec5-5ece71f36fab] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted

=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [c6f780aa-7129-4543-8ec5-5ece71f36fab] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 8.016559015s
functional_test_mount_test.go:162: (dbg) Run:  kubectl --context functional-20220921143144-3535 logs busybox-mount
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:87: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port1595533174/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (9.85s)

TestFunctional/parallel/MountCmd/specific-port (1.44s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:206: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port2384138517/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (154.479515ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh -- ls -la /mount-9p
functional_test_mount_test.go:254: guest mount directory contents
total 0
functional_test_mount_test.go:256: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port2384138517/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:257: reading mount text
functional_test_mount_test.go:271: done reading mount text
functional_test_mount_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh "sudo umount -f /mount-9p": exit status 1 (116.999376ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr **
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:225: "out/minikube-darwin-amd64 -p functional-20220921143144-3535 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:227: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220921143144-3535 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port2384138517/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.44s)

TestFunctional/delete_addon-resizer_images (0.16s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-20220921143144-3535
--- PASS: TestFunctional/delete_addon-resizer_images (0.16s)

TestFunctional/delete_my-image_image (0.08s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:193: (dbg) Run:  docker rmi -f localhost/my-image:functional-20220921143144-3535
--- PASS: TestFunctional/delete_my-image_image (0.08s)

TestFunctional/delete_minikube_cached_images (0.06s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:201: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-20220921143144-3535
--- PASS: TestFunctional/delete_minikube_cached_images (0.06s)

TestIngressAddonLegacy/StartLegacyK8sCluster (75.63s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220921143604-3535 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
E0921 14:36:58.322684    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220921143604-3535 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m15.631801666s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (75.63s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (16.22s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons enable ingress --alsologtostderr -v=5
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons enable ingress --alsologtostderr -v=5: (16.221661094s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (16.22s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.54s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.54s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (29.56s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:164: (dbg) Run:  kubectl --context ingress-addon-legacy-20220921143604-3535 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:164: (dbg) Done: kubectl --context ingress-addon-legacy-20220921143604-3535 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (9.839970514s)
addons_test.go:184: (dbg) Run:  kubectl --context ingress-addon-legacy-20220921143604-3535 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:197: (dbg) Run:  kubectl --context ingress-addon-legacy-20220921143604-3535 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:202: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [c6e5b9f3-4906-4b8e-82da-8736ef16b870] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx" [c6e5b9f3-4906-4b8e-82da-8736ef16b870] Running
addons_test.go:202: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 9.016515843s
addons_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:238: (dbg) Run:  kubectl --context ingress-addon-legacy-20220921143604-3535 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 ip
addons_test.go:249: (dbg) Run:  nslookup hello-john.test 192.168.64.5
addons_test.go:258: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:258: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons disable ingress-dns --alsologtostderr -v=1: (2.513513476s)
addons_test.go:263: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons disable ingress --alsologtostderr -v=1
addons_test.go:263: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220921143604-3535 addons disable ingress --alsologtostderr -v=1: (7.193534591s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (29.56s)

TestJSONOutput/start/Command (52.98s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-20220921143808-3535 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-20220921143808-3535 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (52.976391228s)
--- PASS: TestJSONOutput/start/Command (52.98s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.5s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-20220921143808-3535 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.50s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.43s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-20220921143808-3535 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.43s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.15s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-20220921143808-3535 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-20220921143808-3535 --output=json --user=testUser: (8.147294119s)
--- PASS: TestJSONOutput/stop/Command (8.15s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.76s)

=== RUN   TestErrorJSONOutput
json_output_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-20220921143911-3535 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:149: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-20220921143911-3535 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (330.228581ms)

-- stdout --
	{"specversion":"1.0","id":"d88b1251-1b9f-42ba-904a-63bd1666efe2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-20220921143911-3535] minikube v1.27.0 on Darwin 12.6","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"c77e2793-8417-4304-bbc1-72e876184f27","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=14995"}}
	{"specversion":"1.0","id":"470daf24-ecf6-40ab-8886-5daa68626e83","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig"}}
	{"specversion":"1.0","id":"7f92a220-661f-442c-b86a-4d6a0d48b2ec","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"b66a7e1a-69a2-4d67-9abf-89e986082a55","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"f3302963-a5f8-4f76-bead-ae31d5816007","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube"}}
	{"specversion":"1.0","id":"1f1d800f-72f9-4a50-a3bb-80e67b7e0bf7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-20220921143911-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-20220921143911-3535
--- PASS: TestErrorJSONOutput (0.76s)
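Each line in the `--output=json` stdout above is a CloudEvents-style envelope with minikube's payload nested under `data`. As an illustration only (not part of the test suite), a minimal sketch of extracting the error details from the final event, using the `io.k8s.sigs.minikube.error` line copied verbatim from this log:

```python
import json

# The error event emitted by the failed `--driver=fail` start, copied
# verbatim from the stdout block above.
event_line = (
    '{"specversion":"1.0","id":"1f1d800f-72f9-4a50-a3bb-80e67b7e0bf7",'
    '"source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error",'
    '"datacontenttype":"application/json","data":{"advice":"","exitcode":"56",'
    '"issues":"","message":"The driver \'fail\' is not supported on darwin/amd64",'
    '"name":"DRV_UNSUPPORTED_OS","url":""}}'
)

event = json.loads(event_line)
if event["type"] == "io.k8s.sigs.minikube.error":
    data = event["data"]
    # Prints: DRV_UNSUPPORTED_OS (exit 56): The driver 'fail' is not supported on darwin/amd64
    print(f'{data["name"]} (exit {data["exitcode"]}): {data["message"]}')
```

This is the same envelope structure visible in the `io.k8s.sigs.minikube.step` and `io.k8s.sigs.minikube.info` lines above; only the `type` and the shape of `data` differ.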

TestMainNoArgs (0.07s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.07s)

TestMinikubeProfile (90.65s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-20220921143911-3535 --driver=hyperkit 
E0921 14:39:14.453901    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:39:42.165607    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:39:48.801825    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:48.806893    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:48.816996    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:48.837716    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:48.878137    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:48.958649    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:49.119625    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:49.440293    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:50.082355    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:51.405610    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-20220921143911-3535 --driver=hyperkit : (41.45823136s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-20220921143911-3535 --driver=hyperkit 
E0921 14:39:53.966770    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:39:59.087612    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:40:09.328472    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:40:29.809595    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-20220921143911-3535 --driver=hyperkit : (39.550115097s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-20220921143911-3535
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-20220921143911-3535
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-20220921143911-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-20220921143911-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-20220921143911-3535: (3.343082345s)
helpers_test.go:175: Cleaning up "first-20220921143911-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-20220921143911-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-20220921143911-3535: (5.272740129s)
--- PASS: TestMinikubeProfile (90.65s)
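`profile list -ojson` is run twice above, but its output is not captured in this log. A sketch of consuming such output, with a hypothetical sample document; the `valid`/`invalid` field names are assumptions for illustration, not taken from this report:

```python
import json

# Hypothetical stand-in for `minikube profile list -ojson` output; the real
# schema is not shown in this log, so treat these field names as assumptions.
sample = """
{
  "invalid": [],
  "valid": [
    {"Name": "first-20220921143911-3535", "Status": "Running"},
    {"Name": "second-20220921143911-3535", "Status": "Running"}
  ]
}
"""

profiles = json.loads(sample)
running = [p["Name"] for p in profiles["valid"] if p["Status"] == "Running"]
print(running)  # both profiles started above are listed
```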

TestMountStart/serial/StartWithMountFirst (15.56s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-20220921144042-3535 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-20220921144042-3535 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (14.564032185s)
--- PASS: TestMountStart/serial/StartWithMountFirst (15.56s)

TestMountStart/serial/VerifyMountFirst (0.27s)
=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220921144042-3535 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220921144042-3535 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.27s)
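The two VerifyMount commands above pass when `ls /minikube-host` succeeds and the guest's mount table contains a 9p entry. A minimal sketch of the grep step against simulated `mount` output (the sample lines below are hypothetical, not taken from this run):

```shell
# Hypothetical guest mount table; the real test runs `mount` over ssh.
mount_output="/dev/vda1 on /data type ext4 (rw,relatime)
192.168.64.1 on /minikube-host type 9p (trans=tcp,port=46464,msize=6543)"

# The verification step simply greps for a 9p filesystem entry.
match=$(printf '%s\n' "$mount_output" | grep 9p)
echo "$match"
```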

TestMountStart/serial/StartWithMountSecond (16.84s)
=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220921144042-3535 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E0921 14:41:10.770555    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220921144042-3535 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.841719804s)
--- PASS: TestMountStart/serial/StartWithMountSecond (16.84s)

TestMountStart/serial/VerifyMountSecond (0.27s)
=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.27s)

TestMountStart/serial/DeleteFirst (2.36s)
=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-20220921144042-3535 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-20220921144042-3535 --alsologtostderr -v=5: (2.361851683s)
--- PASS: TestMountStart/serial/DeleteFirst (2.36s)

TestMountStart/serial/VerifyMountPostDelete (0.27s)
=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.27s)

TestMountStart/serial/Stop (2.23s)
=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-20220921144042-3535
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-20220921144042-3535: (2.226356046s)
--- PASS: TestMountStart/serial/Stop (2.23s)

TestMountStart/serial/RestartStopped (15.98s)
=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220921144042-3535
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220921144042-3535: (14.975803802s)
--- PASS: TestMountStart/serial/RestartStopped (15.98s)

TestMountStart/serial/VerifyMountPostStop (0.28s)
=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220921144042-3535 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)

TestMultiNode/serial/FreshStart2Nodes (120.8s)
=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0921 14:42:32.694052    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:42:36.614797    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.619885    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.629964    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.651247    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.693457    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.774247    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:36.934373    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:37.254651    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:37.894897    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:39.175808    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:41.736015    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:46.857218    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:42:57.106419    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:43:17.594338    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
multinode_test.go:83: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (2m0.573068204s)
multinode_test.go:89: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (120.80s)

TestMultiNode/serial/DeployApp2Nodes (7.71s)
=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- rollout status deployment/busybox
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- rollout status deployment/busybox: (5.98986318s)
multinode_test.go:490: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:502: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-4ptlg -- nslookup kubernetes.io
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-bnq4d -- nslookup kubernetes.io
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-4ptlg -- nslookup kubernetes.default
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-bnq4d -- nslookup kubernetes.default
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-4ptlg -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-bnq4d -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (7.71s)

TestMultiNode/serial/PingHostFrom2Pods (0.82s)
=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:538: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-4ptlg -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-4ptlg -- sh -c "ping -c 1 192.168.64.1"
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-bnq4d -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220921144139-3535 -- exec busybox-65db55d5d6-bnq4d -- sh -c "ping -c 1 192.168.64.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.82s)
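The host-IP lookup above relies on busybox `nslookup` printing the resolved address on line 5, with the IP as the third space-separated field. A sketch of that extraction against canned output (the sample text below is hypothetical, not captured from this run):

```shell
# Hypothetical busybox nslookup output for host.minikube.internal.
nslookup_output="Server:    10.96.0.10
Address:   10.96.0.10:53

Name:      host.minikube.internal
Address 1: 192.168.64.1 host.minikube.internal"

# Same pipeline as the test: take line 5, then field 3.
ip=$(printf '%s\n' "$nslookup_output" | awk 'NR==5' | cut -d' ' -f3)
echo "$ip"
```

The extracted address is then what the test pings from inside each busybox pod (`ping -c 1 192.168.64.1` above).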

TestMultiNode/serial/AddNode (43.31s)
=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220921144139-3535 -v 3 --alsologtostderr
E0921 14:43:58.559073    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:44:14.477224    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
multinode_test.go:108: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-20220921144139-3535 -v 3 --alsologtostderr: (43.010751233s)
multinode_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (43.31s)

TestMultiNode/serial/ProfileList (0.26s)
=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.26s)

TestMultiNode/serial/CopyFile (5.08s)
=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --output json --alsologtostderr
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp testdata/cp-test.txt multinode-20220921144139-3535:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile2484560095/001/cp-test_multinode-20220921144139-3535.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535:/home/docker/cp-test.txt multinode-20220921144139-3535-m02:/home/docker/cp-test_multinode-20220921144139-3535_multinode-20220921144139-3535-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535_multinode-20220921144139-3535-m02.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535:/home/docker/cp-test.txt multinode-20220921144139-3535-m03:/home/docker/cp-test_multinode-20220921144139-3535_multinode-20220921144139-3535-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535_multinode-20220921144139-3535-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp testdata/cp-test.txt multinode-20220921144139-3535-m02:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m02:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile2484560095/001/cp-test_multinode-20220921144139-3535-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m02:/home/docker/cp-test.txt multinode-20220921144139-3535:/home/docker/cp-test_multinode-20220921144139-3535-m02_multinode-20220921144139-3535.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535-m02_multinode-20220921144139-3535.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m02:/home/docker/cp-test.txt multinode-20220921144139-3535-m03:/home/docker/cp-test_multinode-20220921144139-3535-m02_multinode-20220921144139-3535-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535-m02_multinode-20220921144139-3535-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp testdata/cp-test.txt multinode-20220921144139-3535-m03:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m03:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile2484560095/001/cp-test_multinode-20220921144139-3535-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m03:/home/docker/cp-test.txt multinode-20220921144139-3535:/home/docker/cp-test_multinode-20220921144139-3535-m03_multinode-20220921144139-3535.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535-m03_multinode-20220921144139-3535.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 cp multinode-20220921144139-3535-m03:/home/docker/cp-test.txt multinode-20220921144139-3535-m02:/home/docker/cp-test_multinode-20220921144139-3535-m03_multinode-20220921144139-3535-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 ssh -n multinode-20220921144139-3535-m02 "sudo cat /home/docker/cp-test_multinode-20220921144139-3535-m03_multinode-20220921144139-3535-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.08s)

TestMultiNode/serial/StopNode (2.66s)
=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node stop m03
multinode_test.go:208: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node stop m03: (2.198803373s)
multinode_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status
multinode_test.go:214: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status: exit status 7 (228.457693ms)

-- stdout --
	multinode-20220921144139-3535
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220921144139-3535-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220921144139-3535-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:221: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
multinode_test.go:221: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr: exit status 7 (231.438619ms)

-- stdout --
	multinode-20220921144139-3535
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220921144139-3535-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220921144139-3535-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0921 14:44:39.887071    6596 out.go:296] Setting OutFile to fd 1 ...
	I0921 14:44:39.887227    6596 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:44:39.887232    6596 out.go:309] Setting ErrFile to fd 2...
	I0921 14:44:39.887236    6596 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 14:44:39.887335    6596 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 14:44:39.887504    6596 out.go:303] Setting JSON to false
	I0921 14:44:39.887519    6596 mustload.go:65] Loading cluster: multinode-20220921144139-3535
	I0921 14:44:39.887795    6596 config.go:180] Loaded profile config "multinode-20220921144139-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 14:44:39.887807    6596 status.go:253] checking status of multinode-20220921144139-3535 ...
	I0921 14:44:39.888141    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:39.888187    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:39.894311    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51307
	I0921 14:44:39.894717    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:39.895166    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:39.895177    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:39.895390    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:39.895485    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetState
	I0921 14:44:39.895567    6596 main.go:134] libmachine: (multinode-20220921144139-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 14:44:39.895641    6596 main.go:134] libmachine: (multinode-20220921144139-3535) DBG | hyperkit pid from json: 6140
	I0921 14:44:39.896588    6596 status.go:328] multinode-20220921144139-3535 host status = "Running" (err=<nil>)
	I0921 14:44:39.896606    6596 host.go:66] Checking if "multinode-20220921144139-3535" exists ...
	I0921 14:44:39.896850    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:39.896876    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:39.902975    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51309
	I0921 14:44:39.903327    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:39.903658    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:39.903673    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:39.903862    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:39.903953    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetIP
	I0921 14:44:39.904026    6596 host.go:66] Checking if "multinode-20220921144139-3535" exists ...
	I0921 14:44:39.904328    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:39.904348    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:39.910274    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51311
	I0921 14:44:39.910627    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:39.910943    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:39.910955    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:39.911138    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:39.911232    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .DriverName
	I0921 14:44:39.911360    6596 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0921 14:44:39.911381    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetSSHHostname
	I0921 14:44:39.911460    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetSSHPort
	I0921 14:44:39.911549    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetSSHKeyPath
	I0921 14:44:39.911623    6596 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetSSHUsername
	I0921 14:44:39.911691    6596 sshutil.go:53] new ssh client: &{IP:192.168.64.11 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/multinode-20220921144139-3535/id_rsa Username:docker}
	I0921 14:44:39.945536    6596 ssh_runner.go:195] Run: systemctl --version
	I0921 14:44:39.948910    6596 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 14:44:39.958210    6596 kubeconfig.go:92] found "multinode-20220921144139-3535" server: "https://192.168.64.11:8443"
	I0921 14:44:39.958230    6596 api_server.go:165] Checking apiserver status ...
	I0921 14:44:39.958268    6596 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0921 14:44:39.967276    6596 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1753/cgroup
	I0921 14:44:39.973652    6596 api_server.go:181] apiserver freezer: "4:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5a887d81a80a220ba989c69ae9ab54.slice/docker-df2924b7a7b44937b2f220e77da36801405efdfd87a5bc4e1a055de297603517.scope"
	I0921 14:44:39.973696    6596 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5a887d81a80a220ba989c69ae9ab54.slice/docker-df2924b7a7b44937b2f220e77da36801405efdfd87a5bc4e1a055de297603517.scope/freezer.state
	I0921 14:44:39.981153    6596 api_server.go:203] freezer state: "THAWED"
	I0921 14:44:39.981167    6596 api_server.go:240] Checking apiserver healthz at https://192.168.64.11:8443/healthz ...
	I0921 14:44:39.985185    6596 api_server.go:266] https://192.168.64.11:8443/healthz returned 200:
	ok
	I0921 14:44:39.985196    6596 status.go:419] multinode-20220921144139-3535 apiserver status = Running (err=<nil>)
	I0921 14:44:39.985203    6596 status.go:255] multinode-20220921144139-3535 status: &{Name:multinode-20220921144139-3535 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0921 14:44:39.985220    6596 status.go:253] checking status of multinode-20220921144139-3535-m02 ...
	I0921 14:44:39.985468    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:39.985507    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:39.991604    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51315
	I0921 14:44:39.991968    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:39.992269    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:39.992280    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:39.992457    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:39.992550    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetState
	I0921 14:44:39.992629    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 14:44:39.992705    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) DBG | hyperkit pid from json: 6218
	I0921 14:44:39.993599    6596 status.go:328] multinode-20220921144139-3535-m02 host status = "Running" (err=<nil>)
	I0921 14:44:39.993605    6596 host.go:66] Checking if "multinode-20220921144139-3535-m02" exists ...
	I0921 14:44:39.993872    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:39.993891    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:39.999718    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51317
	I0921 14:44:40.000082    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:40.000395    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:40.000408    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:40.000627    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:40.000730    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetIP
	I0921 14:44:40.000800    6596 host.go:66] Checking if "multinode-20220921144139-3535-m02" exists ...
	I0921 14:44:40.001075    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:40.001098    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:40.007132    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51319
	I0921 14:44:40.007527    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:40.007827    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:40.007838    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:40.008057    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:40.008159    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .DriverName
	I0921 14:44:40.008282    6596 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0921 14:44:40.008294    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetSSHHostname
	I0921 14:44:40.008372    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetSSHPort
	I0921 14:44:40.008458    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetSSHKeyPath
	I0921 14:44:40.008540    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetSSHUsername
	I0921 14:44:40.008622    6596 sshutil.go:53] new ssh client: &{IP:192.168.64.12 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/machines/multinode-20220921144139-3535-m02/id_rsa Username:docker}
	I0921 14:44:40.049867    6596 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0921 14:44:40.058160    6596 status.go:255] multinode-20220921144139-3535-m02 status: &{Name:multinode-20220921144139-3535-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0921 14:44:40.058178    6596 status.go:253] checking status of multinode-20220921144139-3535-m03 ...
	I0921 14:44:40.058448    6596 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 14:44:40.058469    6596 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 14:44:40.064712    6596 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51322
	I0921 14:44:40.065174    6596 main.go:134] libmachine: () Calling .GetVersion
	I0921 14:44:40.065539    6596 main.go:134] libmachine: Using API Version  1
	I0921 14:44:40.065551    6596 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 14:44:40.065787    6596 main.go:134] libmachine: () Calling .GetMachineName
	I0921 14:44:40.065882    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m03) Calling .GetState
	I0921 14:44:40.065953    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 14:44:40.066040    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m03) DBG | hyperkit pid from json: 6346
	I0921 14:44:40.066971    6596 main.go:134] libmachine: (multinode-20220921144139-3535-m03) DBG | hyperkit pid 6346 missing from process table
	I0921 14:44:40.066991    6596 status.go:328] multinode-20220921144139-3535-m03 host status = "Stopped" (err=<nil>)
	I0921 14:44:40.066998    6596 status.go:341] host is not running, skipping remaining checks
	I0921 14:44:40.067002    6596 status.go:255] multinode-20220921144139-3535-m03 status: &{Name:multinode-20220921144139-3535-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.66s)

TestMultiNode/serial/StartAfterStop (33.46s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:252: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node start m03 --alsologtostderr
E0921 14:44:48.825234    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
multinode_test.go:252: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node start m03 --alsologtostderr: (33.117394098s)
multinode_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status
multinode_test.go:273: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (33.46s)

TestMultiNode/serial/RestartKeepsNodes (910.6s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220921144139-3535
multinode_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-20220921144139-3535
E0921 14:45:16.555423    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:45:20.482195    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
multinode_test.go:288: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-20220921144139-3535: (12.360889119s)
multinode_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true -v=8 --alsologtostderr
E0921 14:47:36.635827    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:48:04.326425    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:49:14.481018    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:49:48.828709    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:50:37.554965    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:52:36.641432    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:54:14.485608    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:54:48.832263    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:56:11.923644    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 14:57:36.643527    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:58:59.694663    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 14:59:14.489227    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 14:59:48.835777    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
multinode_test.go:293: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true -v=8 --alsologtostderr: (14m58.136117399s)
multinode_test.go:298: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220921144139-3535
--- PASS: TestMultiNode/serial/RestartKeepsNodes (910.60s)

TestMultiNode/serial/DeleteNode (4.95s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:392: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node delete m03
multinode_test.go:392: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 node delete m03: (4.63746934s)
multinode_test.go:398: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
multinode_test.go:422: (dbg) Run:  kubectl get nodes
multinode_test.go:430: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (4.95s)

TestMultiNode/serial/StopMultiNode (4.48s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:312: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 stop
multinode_test.go:312: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 stop: (4.342491195s)
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status
multinode_test.go:318: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status: exit status 7 (67.61124ms)

-- stdout --
	multinode-20220921144139-3535
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220921144139-3535-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
multinode_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr: exit status 7 (67.504186ms)

-- stdout --
	multinode-20220921144139-3535
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220921144139-3535-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0921 15:00:33.545869    7676 out.go:296] Setting OutFile to fd 1 ...
	I0921 15:00:33.546048    7676 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:00:33.546053    7676 out.go:309] Setting ErrFile to fd 2...
	I0921 15:00:33.546057    7676 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0921 15:00:33.546168    7676 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/bin
	I0921 15:00:33.546338    7676 out.go:303] Setting JSON to false
	I0921 15:00:33.546353    7676 mustload.go:65] Loading cluster: multinode-20220921144139-3535
	I0921 15:00:33.546672    7676 config.go:180] Loaded profile config "multinode-20220921144139-3535": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.2
	I0921 15:00:33.546685    7676 status.go:253] checking status of multinode-20220921144139-3535 ...
	I0921 15:00:33.547010    7676 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:00:33.547060    7676 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:00:33.552956    7676 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51524
	I0921 15:00:33.553329    7676 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:00:33.553716    7676 main.go:134] libmachine: Using API Version  1
	I0921 15:00:33.553727    7676 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:00:33.553984    7676 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:00:33.554088    7676 main.go:134] libmachine: (multinode-20220921144139-3535) Calling .GetState
	I0921 15:00:33.554164    7676 main.go:134] libmachine: (multinode-20220921144139-3535) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:00:33.554234    7676 main.go:134] libmachine: (multinode-20220921144139-3535) DBG | hyperkit pid from json: 6697
	I0921 15:00:33.554921    7676 main.go:134] libmachine: (multinode-20220921144139-3535) DBG | hyperkit pid 6697 missing from process table
	I0921 15:00:33.554955    7676 status.go:328] multinode-20220921144139-3535 host status = "Stopped" (err=<nil>)
	I0921 15:00:33.554963    7676 status.go:341] host is not running, skipping remaining checks
	I0921 15:00:33.554967    7676 status.go:255] multinode-20220921144139-3535 status: &{Name:multinode-20220921144139-3535 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0921 15:00:33.554985    7676 status.go:253] checking status of multinode-20220921144139-3535-m02 ...
	I0921 15:00:33.555230    7676 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0921 15:00:33.555250    7676 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0921 15:00:33.561132    7676 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:51526
	I0921 15:00:33.561452    7676 main.go:134] libmachine: () Calling .GetVersion
	I0921 15:00:33.561772    7676 main.go:134] libmachine: Using API Version  1
	I0921 15:00:33.561784    7676 main.go:134] libmachine: () Calling .SetConfigRaw
	I0921 15:00:33.561971    7676 main.go:134] libmachine: () Calling .GetMachineName
	I0921 15:00:33.562058    7676 main.go:134] libmachine: (multinode-20220921144139-3535-m02) Calling .GetState
	I0921 15:00:33.562129    7676 main.go:134] libmachine: (multinode-20220921144139-3535-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0921 15:00:33.562206    7676 main.go:134] libmachine: (multinode-20220921144139-3535-m02) DBG | hyperkit pid from json: 7027
	I0921 15:00:33.562908    7676 main.go:134] libmachine: (multinode-20220921144139-3535-m02) DBG | hyperkit pid 7027 missing from process table
	I0921 15:00:33.562950    7676 status.go:328] multinode-20220921144139-3535-m02 host status = "Stopped" (err=<nil>)
	I0921 15:00:33.562963    7676 status.go:341] host is not running, skipping remaining checks
	I0921 15:00:33.562969    7676 status.go:255] multinode-20220921144139-3535-m02 status: &{Name:multinode-20220921144139-3535-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (4.48s)

TestMultiNode/serial/RestartMultiNode (553.86s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:352: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0921 15:02:36.660528    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:04:14.505481    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:04:48.855354    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:07:17.582578    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:07:36.664086    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:09:14.509080    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
multinode_test.go:352: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220921144139-3535 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (9m13.545750314s)
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220921144139-3535 status --alsologtostderr
multinode_test.go:372: (dbg) Run:  kubectl get nodes
multinode_test.go:380: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (553.86s)

TestMultiNode/serial/ValidateNameConflict (47.36s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:441: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220921144139-3535
multinode_test.go:450: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220921144139-3535-m02 --driver=hyperkit 
multinode_test.go:450: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-20220921144139-3535-m02 --driver=hyperkit : exit status 14 (345.240496ms)

-- stdout --
	* [multinode-20220921144139-3535-m02] minikube v1.27.0 on Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-20220921144139-3535-m02' is duplicated with machine name 'multinode-20220921144139-3535-m02' in profile 'multinode-20220921144139-3535'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:458: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220921144139-3535-m03 --driver=hyperkit 
E0921 15:09:48.856918    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
multinode_test.go:458: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220921144139-3535-m03 --driver=hyperkit : (41.40759785s)
multinode_test.go:465: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220921144139-3535
multinode_test.go:465: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-20220921144139-3535: exit status 80 (285.476954ms)

-- stdout --
	* Adding node m03 to cluster multinode-20220921144139-3535
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: Node multinode-20220921144139-3535-m03 already exists in multinode-20220921144139-3535-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:470: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-20220921144139-3535-m03
multinode_test.go:470: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-20220921144139-3535-m03: (5.26812951s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (47.36s)

TestPreload (172.24s)

=== RUN   TestPreload
preload_test.go:48: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220921151039-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0
preload_test.go:48: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220921151039-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0: (1m33.028118927s)
preload_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220921151039-3535 -- docker pull gcr.io/k8s-minikube/busybox
preload_test.go:61: (dbg) Done: out/minikube-darwin-amd64 ssh -p test-preload-20220921151039-3535 -- docker pull gcr.io/k8s-minikube/busybox: (4.237170124s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220921151039-3535 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3
E0921 15:12:36.667669    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:12:51.950941    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
preload_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220921151039-3535 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3: (1m9.541188676s)
preload_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220921151039-3535 -- docker images
helpers_test.go:175: Cleaning up "test-preload-20220921151039-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-20220921151039-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-20220921151039-3535: (5.274842641s)
--- PASS: TestPreload (172.24s)

TestScheduledStopUnix (112.59s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-20220921151331-3535 --memory=2048 --driver=hyperkit 
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-20220921151331-3535 --memory=2048 --driver=hyperkit : (41.226644796s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220921151331-3535 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-20220921151331-3535 -n scheduled-stop-20220921151331-3535
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220921151331-3535 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220921151331-3535 --cancel-scheduled
E0921 15:14:14.512777    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220921151331-3535 -n scheduled-stop-20220921151331-3535
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220921151331-3535
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220921151331-3535 --schedule 15s
E0921 15:14:48.861707    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220921151331-3535
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-20220921151331-3535: exit status 7 (61.729468ms)

-- stdout --
	scheduled-stop-20220921151331-3535
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220921151331-3535 -n scheduled-stop-20220921151331-3535
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220921151331-3535 -n scheduled-stop-20220921151331-3535: exit status 7 (60.008563ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-20220921151331-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-20220921151331-3535
--- PASS: TestScheduledStopUnix (112.59s)

TestSkaffold (73.14s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe3824895629 version
skaffold_test.go:63: skaffold version: v1.39.2
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-20220921151524-3535 --memory=2600 --driver=hyperkit 
E0921 15:15:39.721807    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-20220921151524-3535 --memory=2600 --driver=hyperkit : (39.749323921s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:110: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe3824895629 run --minikube-profile skaffold-20220921151524-3535 --kube-context skaffold-20220921151524-3535 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:110: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe3824895629 run --minikube-profile skaffold-20220921151524-3535 --kube-context skaffold-20220921151524-3535 --status-check=true --port-forward=false --interactive=false: (16.252928944s)
skaffold_test.go:116: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:342: "leeroy-app-6f94fd8f78-s5kw9" [bfc37693-2b6f-4e89-9089-fd947336b9e4] Running
skaffold_test.go:116: (dbg) TestSkaffold: app=leeroy-app healthy within 5.010027794s
skaffold_test.go:119: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:342: "leeroy-web-7947bcc67c-n2h5f" [4add07e9-910b-4569-866c-1337a67bbb7f] Running
skaffold_test.go:119: (dbg) TestSkaffold: app=leeroy-web healthy within 5.005703013s
helpers_test.go:175: Cleaning up "skaffold-20220921151524-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-20220921151524-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-20220921151524-3535: (5.282726728s)
--- PASS: TestSkaffold (73.14s)

TestRunningBinaryUpgrade (168.71s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3279544437.exe start -p running-upgrade-20220921152233-3535 --memory=2200 --vm-driver=hyperkit 
E0921 15:22:36.643139    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:22:44.069834    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3279544437.exe start -p running-upgrade-20220921152233-3535 --memory=2200 --vm-driver=hyperkit : (1m40.601255048s)
version_upgrade_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-20220921152233-3535 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:137: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-20220921152233-3535 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m2.220294658s)
helpers_test.go:175: Cleaning up "running-upgrade-20220921152233-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-20220921152233-3535

=== CONT  TestRunningBinaryUpgrade
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-20220921152233-3535: (5.27672581s)
--- PASS: TestRunningBinaryUpgrade (168.71s)

TestKubernetesUpgrade (139.64s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:229: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
E0921 15:19:48.834511    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
version_upgrade_test.go:229: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m10.43915443s)
version_upgrade_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220921151918-3535
version_upgrade_test.go:234: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220921151918-3535: (2.243557998s)
version_upgrade_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-20220921151918-3535 status --format={{.Host}}
version_upgrade_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-20220921151918-3535 status --format={{.Host}}: exit status 7 (60.651432ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:241: status error: exit status 7 (may be ok)
version_upgrade_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.25.2 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:250: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.25.2 --alsologtostderr -v=1 --driver=hyperkit : (39.060865873s)
version_upgrade_test.go:255: (dbg) Run:  kubectl --context kubernetes-upgrade-20220921151918-3535 version --output=json
version_upgrade_test.go:274: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:276: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:276: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (559.727842ms)

-- stdout --
	* [kubernetes-upgrade-20220921151918-3535] minikube v1.27.0 on Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.25.2 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-20220921151918-3535
	    minikube start -p kubernetes-upgrade-20220921151918-3535 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-20220921151918-35352 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.25.2, by running:
	    
	    minikube start -p kubernetes-upgrade-20220921151918-3535 --kubernetes-version=v1.25.2
	    

** /stderr **
version_upgrade_test.go:280: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.25.2 --alsologtostderr -v=1 --driver=hyperkit 
E0921 15:21:22.139211    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.144446    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.155016    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.177181    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.217842    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.298138    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.460305    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:22.781391    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:23.421532    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:24.702896    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:27.265144    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:21:32.386197    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
version_upgrade_test.go:282: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220921151918-3535 --memory=2200 --kubernetes-version=v1.25.2 --alsologtostderr -v=1 --driver=hyperkit : (21.965884427s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-20220921151918-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220921151918-3535
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220921151918-3535: (5.26574317s)
--- PASS: TestKubernetesUpgrade (139.64s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.07s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.27.0 on darwin
- MINIKUBE_LOCATION=14995
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1147855368/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1147855368/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1147855368/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1147855368/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.07s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.04s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.27.0 on darwin
- MINIKUBE_LOCATION=14995
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current882252237/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current882252237/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current882252237/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current882252237/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.04s)

TestStoppedBinaryUpgrade/Setup (0.65s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.65s)

TestStoppedBinaryUpgrade/Upgrade (172.45s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3415394834.exe start -p stopped-upgrade-20220921152137-3535 --memory=2200 --vm-driver=hyperkit 
E0921 15:21:42.627220    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory

=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3415394834.exe start -p stopped-upgrade-20220921152137-3535 --memory=2200 --vm-driver=hyperkit : (1m47.267090871s)
version_upgrade_test.go:199: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3415394834.exe -p stopped-upgrade-20220921152137-3535 stop
version_upgrade_test.go:199: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.6.2.3415394834.exe -p stopped-upgrade-20220921152137-3535 stop: (8.072918772s)
version_upgrade_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-20220921152137-3535 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0921 15:23:57.563477    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:24:05.991116    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:24:14.486551    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory

=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:205: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-20220921152137-3535 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (57.108641s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (172.45s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.23s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:213: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-20220921152137-3535
version_upgrade_test.go:213: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-20220921152137-3535: (2.230981438s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.23s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.39s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (386.754822ms)

-- stdout --
	* [NoKubernetes-20220921152435-3535] minikube v1.27.0 on Darwin 12.6
	  - MINIKUBE_LOCATION=14995
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.39s)

TestNoKubernetes/serial/StartWithK8s (41.86s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --driver=hyperkit 
E0921 15:24:48.834333    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory

=== CONT  TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --driver=hyperkit : (41.695521494s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220921152435-3535 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (41.86s)

TestNoKubernetes/serial/StartWithStopK8s (7.9s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --driver=hyperkit 

=== CONT  TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --driver=hyperkit : (5.293706637s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220921152435-3535 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-20220921152435-3535 status -o json: exit status 2 (140.775651ms)

-- stdout --
	{"Name":"NoKubernetes-20220921152435-3535","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-20220921152435-3535
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-20220921152435-3535: (2.462118995s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (7.90s)

TestPause/serial/Start (54.31s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220921152522-3535 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 

=== CONT  TestPause/serial/Start
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220921152522-3535 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (54.305090292s)
--- PASS: TestPause/serial/Start (54.31s)

TestNoKubernetes/serial/Start (21.9s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --no-kubernetes --driver=hyperkit : (21.899760323s)
--- PASS: TestNoKubernetes/serial/Start (21.90s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220921152435-3535 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220921152435-3535 "sudo systemctl is-active --quiet service kubelet": exit status 1 (123.641626ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

TestNoKubernetes/serial/ProfileList (0.53s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.53s)

TestNoKubernetes/serial/Stop (2.21s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-20220921152435-3535
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-20220921152435-3535: (2.206737531s)
--- PASS: TestNoKubernetes/serial/Stop (2.21s)

TestNoKubernetes/serial/StartNoArgs (15.5s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --driver=hyperkit 
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220921152435-3535 --driver=hyperkit : (15.499522012s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (15.50s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.11s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220921152435-3535 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220921152435-3535 "sudo systemctl is-active --quiet service kubelet": exit status 1 (112.689024ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.11s)

TestNetworkPlugins/group/false/Start (91.45s)

=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p false-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/false/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p false-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit : (1m31.452778834s)
--- PASS: TestNetworkPlugins/group/false/Start (91.45s)

TestNetworkPlugins/group/false/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/false/NetCatPod (13.3s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context false-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-7qc7r" [2e36771c-4958-471a-a695-39a23e675f53] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/false/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-7qc7r" [2e36771c-4958-471a-a695-39a23e675f53] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 13.005253093s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (13.30s)

                                                
                                    
TestNetworkPlugins/group/auto/Start (54.48s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/auto/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p auto-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit : (54.476918059s)
--- PASS: TestNetworkPlugins/group/auto/Start (54.48s)

                                                
                                    
TestNetworkPlugins/group/false/DNS (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:169: (dbg) Run:  kubectl --context false-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/false/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:188: (dbg) Run:  kubectl --context false-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/false/HairPin (5.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:238: (dbg) Run:  kubectl --context false-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context false-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.122818297s)

                                                
                                                
** stderr ** 
	command terminated with exit code 1

                                                
                                                
** /stderr **
--- PASS: TestNetworkPlugins/group/false/HairPin (5.12s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Start (62.38s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit : (1m2.379867032s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (62.38s)

                                                
                                    
TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/auto/NetCatPod (13.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context auto-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-bl4zr" [fad98c9d-9e1e-4749-a503-86c59036b3db] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-bl4zr" [fad98c9d-9e1e-4749-a503-86c59036b3db] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 13.005925637s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (13.19s)

                                                
                                    
TestNetworkPlugins/group/auto/DNS (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:169: (dbg) Run:  kubectl --context auto-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.12s)

                                                
                                    
TestNetworkPlugins/group/auto/Localhost (0.09s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:188: (dbg) Run:  kubectl --context auto-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.09s)

                                                
                                    
TestNetworkPlugins/group/auto/HairPin (5.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:238: (dbg) Run:  kubectl --context auto-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context auto-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.104257761s)

                                                
                                                
** stderr ** 
	command terminated with exit code 1

                                                
                                                
** /stderr **
--- PASS: TestNetworkPlugins/group/auto/HairPin (5.10s)

                                                
                                    
TestNetworkPlugins/group/flannel/Start (93.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit : (1m33.012398541s)
--- PASS: TestNetworkPlugins/group/flannel/Start (93.01s)

                                                
                                    
TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:342: "kindnet-lntt9" [c5477e2c-792f-462c-9ac4-4da6acfdf4e7] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 5.009888885s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

                                                
                                    
TestNetworkPlugins/group/kindnet/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/kindnet/NetCatPod (14.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kindnet-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-7hqcn" [a8f70896-9da5-4fb7-8824-194743308d1c] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0921 15:29:14.486909    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-7hqcn" [a8f70896-9da5-4fb7-8824-194743308d1c] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 14.00542939s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (14.19s)

                                                
                                    
TestNetworkPlugins/group/kindnet/DNS (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kindnet-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.11s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Localhost (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kindnet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/kindnet/HairPin (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kindnet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Start (56.65s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit 
E0921 15:29:31.926539    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:29:48.834682    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit : (56.651560912s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (56.65s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.21s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.21s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/NetCatPod (13.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context enable-default-cni-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-525gx" [71dca6f4-76c0-4392-b0d9-da169d92658e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/enable-default-cni/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-525gx" [71dca6f4-76c0-4392-b0d9-da169d92658e] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 13.005016127s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (13.19s)

                                                
                                    
TestNetworkPlugins/group/flannel/ControllerPod (5.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-system" ...
helpers_test.go:342: "kube-flannel-ds-amd64-ks6zj" [12e6421b-4ee5-41bf-9fc9-e7add9e9e4bb] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 5.011637821s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (5.01s)

                                                
                                    
TestNetworkPlugins/group/flannel/KubeletFlags (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/flannel/NetCatPod (13.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context flannel-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-jxpqh" [f58ce183-10d7-4cbd-b0d4-96c384b81f47] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-jxpqh" [f58ce183-10d7-4cbd-b0d4-96c384b81f47] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 13.005169538s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (13.19s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:169: (dbg) Run:  kubectl --context enable-default-cni-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:188: (dbg) Run:  kubectl --context enable-default-cni-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:238: (dbg) Run:  kubectl --context enable-default-cni-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/bridge/Start (53.56s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/bridge/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit : (53.555082936s)
--- PASS: TestNetworkPlugins/group/bridge/Start (53.56s)

                                                
                                    
TestNetworkPlugins/group/flannel/DNS (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context flannel-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.11s)

                                                
                                    
TestNetworkPlugins/group/flannel/Localhost (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context flannel-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/flannel/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context flannel-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/kubenet/Start (51.46s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit 
E0921 15:31:22.138875    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory

                                                
                                                
=== CONT  TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit : (51.456525753s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (51.46s)

                                                
                                    
TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/bridge/NetCatPod (13.20s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context bridge-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-2x5vj" [aa3c1efa-0923-402c-bbb7-1747801dc46e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-2x5vj" [aa3c1efa-0923-402c-bbb7-1747801dc46e] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 13.005262185s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (13.20s)

                                                
                                    
TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/kubenet/NetCatPod (14.17s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-644d7" [aa828ce0-bfd6-4543-b761-6209c1a78346] Pending
helpers_test.go:342: "netcat-5788d667bd-644d7" [aa828ce0-bfd6-4543-b761-6209c1a78346] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/kubenet/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-644d7" [aa828ce0-bfd6-4543-b761-6209c1a78346] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 14.004309706s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (14.17s)

                                                
                                    
TestNetworkPlugins/group/bridge/DNS (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:169: (dbg) Run:  kubectl --context bridge-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.11s)

                                                
                                    
TestNetworkPlugins/group/bridge/Localhost (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:188: (dbg) Run:  kubectl --context bridge-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/bridge/HairPin (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:238: (dbg) Run:  kubectl --context bridge-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/calico/Start (309.28s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p calico-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit : (5m9.275120483s)
--- PASS: TestNetworkPlugins/group/calico/Start (309.28s)

                                                
                                    
TestNetworkPlugins/group/kubenet/DNS (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.12s)

                                                
                                    
TestNetworkPlugins/group/kubenet/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kubenet-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/cilium/Start (87.22s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p cilium-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit 
E0921 15:33:21.169657    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.194927    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.200102    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.210235    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.231794    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.272986    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.353109    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.513878    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:36.834010    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:37.474278    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:38.754501    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:41.315597    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:46.435823    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:33:56.676000    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:02.130448    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.354463    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.360861    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.372992    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.395155    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.435683    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.517507    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.677616    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:03.997695    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:04.639050    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:05.919599    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:08.479726    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:13.601288    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:14.500904    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:34:17.156239    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:34:23.841642    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p cilium-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit : (1m27.22029798s)
--- PASS: TestNetworkPlugins/group/cilium/Start (87.22s)

TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/cilium/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: waiting 10m0s for pods matching "k8s-app=cilium" in namespace "kube-system" ...
helpers_test.go:342: "cilium-d69rl" [336fabb5-0adb-4f90-bbc8-ded1267ad86b] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: k8s-app=cilium healthy within 5.011282418s
--- PASS: TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/cilium/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cilium-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

TestNetworkPlugins/group/cilium/NetCatPod (15.64s)

=== RUN   TestNetworkPlugins/group/cilium/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context cilium-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-h7sgj" [6d3b4c70-eefb-457a-b5da-31345dd852fd] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0921 15:34:44.321849    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-h7sgj" [6d3b4c70-eefb-457a-b5da-31345dd852fd] Running
E0921 15:34:48.848642    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: app=netcat healthy within 15.005695899s
--- PASS: TestNetworkPlugins/group/cilium/NetCatPod (15.64s)

TestNetworkPlugins/group/cilium/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/cilium/DNS
net_test.go:169: (dbg) Run:  kubectl --context cilium-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/cilium/DNS (0.12s)

TestNetworkPlugins/group/cilium/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/cilium/Localhost
net_test.go:188: (dbg) Run:  kubectl --context cilium-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/cilium/Localhost (0.11s)

TestNetworkPlugins/group/cilium/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/cilium/HairPin
net_test.go:238: (dbg) Run:  kubectl --context cilium-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/cilium/HairPin (0.11s)

TestNetworkPlugins/group/custom-flannel/Start (60.48s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
E0921 15:34:58.116765    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:24.052089    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.282330    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.349805    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.355059    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.365316    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.387135    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.428198    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.508339    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.668516    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:25.989439    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:26.630833    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:27.911011    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:30.471324    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:32.845573    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:32.850733    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:32.861105    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:32.881409    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:32.923296    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:33.003488    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:33.163587    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:33.483656    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:34.123797    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:35.406029    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:35.591756    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:37.966516    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:43.086793    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:45.832729    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:35:53.328207    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-20220921151637-3535 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (1m0.479093814s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (60.48s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.14s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (14.23s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context custom-flannel-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-rtfdk" [f3d63e36-473a-4981-bca5-f5841dadc46b] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-rtfdk" [f3d63e36-473a-4981-bca5-f5841dadc46b] Running
E0921 15:36:06.315132    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 14.004648601s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (14.23s)

TestNetworkPlugins/group/custom-flannel/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context custom-flannel-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.11s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.10s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context custom-flannel-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.10s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context custom-flannel-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.11s)

TestStartStop/group/old-k8s-version/serial/FirstStart (148.29s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220921153616-3535 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0921 15:36:20.039245    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:22.153690    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:36:35.908965    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:35.914064    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:35.925710    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:35.947753    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:35.988997    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:36.069417    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:36.230309    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:36.550387    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:37.190668    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:38.471559    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:41.032209    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.153961    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.726258    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.732328    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.743265    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.763441    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.803871    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:46.884327    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:47.046015    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:47.203870    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:47.276099    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:47.366742    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:48.007237    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:49.289459    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:51.849831    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:54.771561    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:56.394442    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:36:56.970253    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220921153616-3535 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (2m28.285088418s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (148.29s)

TestNetworkPlugins/group/calico/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:342: "calico-node-spw4t" [26f7aaf0-e0d0-43ff-b394-48a304b5cdf3] Running / Ready:ContainersNotReady (containers with unready status: [calico-node]) / ContainersReady:ContainersNotReady (containers with unready status: [calico-node])
E0921 15:37:07.210688    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 5.012867736s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (5.01s)

TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-20220921151637-3535 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

TestNetworkPlugins/group/calico/NetCatPod (14.28s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context calico-20220921151637-3535 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-nxt24" [e52248f7-8b8b-47c9-84da-62af9c9ed59c] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0921 15:37:16.875069    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-nxt24" [e52248f7-8b8b-47c9-84da-62af9c9ed59c] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 14.005607499s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (14.28s)

TestNetworkPlugins/group/calico/DNS (0.13s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:169: (dbg) Run:  kubectl --context calico-20220921151637-3535 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.13s)

TestNetworkPlugins/group/calico/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:188: (dbg) Run:  kubectl --context calico-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.11s)

TestNetworkPlugins/group/calico/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:238: (dbg) Run:  kubectl --context calico-20220921151637-3535 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.10s)
E0921 15:52:59.007328    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:53:09.827218    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory

TestStartStop/group/no-preload/serial/FirstStart (63.56s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220921153727-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:37:27.693240    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:37:36.657856    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:37:40.205323    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:37:45.206512    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:37:57.835390    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:38:07.894026    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:38:08.654283    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:38:09.196638    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:38:16.693033    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220921153727-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.2: (1m3.56220353s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (63.56s)

TestStartStop/group/no-preload/serial/DeployApp (11.25s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-20220921153727-3535 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [b38faa07-d190-4927-9155-e3873399ab06] Pending
helpers_test.go:342: "busybox" [b38faa07-d190-4927-9155-e3873399ab06] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0921 15:38:36.197581    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "busybox" [b38faa07-d190-4927-9155-e3873399ab06] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 11.01539912s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-20220921153727-3535 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (11.25s)

TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.65s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-20220921153727-3535 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-20220921153727-3535 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.65s)

TestStartStop/group/no-preload/serial/Stop (3.23s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-20220921153727-3535 --alsologtostderr -v=3

=== CONT  TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-20220921153727-3535 --alsologtostderr -v=3: (3.229681528s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (3.23s)

TestStartStop/group/old-k8s-version/serial/DeployApp (11.27s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-20220921153616-3535 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [ba9f61e1-9951-43a6-8cf8-bb363e40d717] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])

=== CONT  TestStartStop/group/old-k8s-version/serial/DeployApp
helpers_test.go:342: "busybox" [ba9f61e1-9951-43a6-8cf8-bb363e40d717] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 11.022783781s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-20220921153616-3535 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (11.27s)

TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535: exit status 7 (59.94669ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-20220921153727-3535 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.27s)

TestStartStop/group/no-preload/serial/SecondStart (316.06s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220921153727-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.2

=== CONT  TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220921153727-3535 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.2: (5m15.895379406s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (316.06s)

TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.57s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-20220921153616-3535 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-20220921153616-3535 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.57s)

TestStartStop/group/old-k8s-version/serial/Stop (2.22s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-20220921153616-3535 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-20220921153616-3535 --alsologtostderr -v=3: (2.220008465s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (2.22s)

TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.26s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535: exit status 7 (58.895527ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-20220921153616-3535 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.26s)

TestStartStop/group/old-k8s-version/serial/SecondStart (456.43s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220921153616-3535 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0921 15:39:03.353450    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:03.880655    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:14.502116    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:39:19.755878    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:30.576864    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.046712    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.067031    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.073331    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.084350    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.105200    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.147398    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.227858    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.390012    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:31.710710    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:32.352920    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:33.633591    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:36.193837    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:41.314034    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:39:48.849415    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:39:51.555592    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:12.037895    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:25.350958    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:32.848647    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:37.579355    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:40:52.999262    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:53.037348    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.356193    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.362575    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.374622    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.395034    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.437190    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.518473    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.678575    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:56.999901    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:57.640281    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:40:58.920415    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:00.533971    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:01.481583    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:06.603362    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:16.844045    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:22.153624    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:41:35.910508    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:37.324678    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:41:46.725509    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.598659    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.844322    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.850182    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.861698    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.882058    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:03.924260    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:04.006014    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:04.166708    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:04.487178    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:05.127515    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:06.443294    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:09.005554    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:14.127988    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:14.417683    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:14.919722    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:18.284996    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:24.370334    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:36.659761    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:42:40.206444    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:42:44.850675    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:43:25.811211    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:43:36.197157    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:43:40.205956    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220921153616-3535 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (7m36.271020421s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (456.43s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (8.01s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-42ncx" [f8f59de5-ed22-403f-9ae1-9fcdc51f8b71] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0921 15:44:03.354946    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-54596f475f-42ncx" [f8f59de5-ed22-403f-9ae1-9fcdc51f8b71] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 8.012158989s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (8.01s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-42ncx" [f8f59de5-ed22-403f-9ae1-9fcdc51f8b71] Running
E0921 15:44:14.503200    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.005602343s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-20220921153727-3535 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p no-preload-20220921153727-3535 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)

TestStartStop/group/no-preload/serial/Pause (1.86s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-20220921153727-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535: exit status 2 (150.186218ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535: exit status 2 (150.53855ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-20220921153727-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220921153727-3535 -n no-preload-20220921153727-3535
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.86s)

TestStartStop/group/embed-certs/serial/FirstStart (63.01s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220921154423-3535 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:44:31.067819    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:44:47.731983    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:44:48.852674    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:44:58.760701    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:45:25.352121    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220921154423-3535 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.2: (1m3.00511486s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (63.01s)

TestStartStop/group/embed-certs/serial/DeployApp (12.31s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-20220921154423-3535 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [4ed29aa4-8e21-4503-b7ad-7e98c41e1df0] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0921 15:45:32.849442    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "busybox" [4ed29aa4-8e21-4503-b7ad-7e98c41e1df0] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 12.01938126s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-20220921154423-3535 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (12.31s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.69s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-20220921154423-3535 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-20220921154423-3535 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.69s)

TestStartStop/group/embed-certs/serial/Stop (3.28s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-20220921154423-3535 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-20220921154423-3535 --alsologtostderr -v=3: (3.281665601s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (3.28s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535: exit status 7 (60.138527ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-20220921154423-3535 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.27s)

TestStartStop/group/embed-certs/serial/SecondStart (313.04s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220921154423-3535 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:45:56.358892    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:46:11.943153    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:46:22.156228    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:46:24.046714    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory

=== CONT  TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220921154423-3535 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.2: (5m12.872132278s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (313.04s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6d946b7fb4-pgr57" [91bc6ac7-54b6-4449-b9a3-c9c1d10b702d] Running
E0921 15:46:35.911494    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.011072598s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6d946b7fb4-pgr57" [91bc6ac7-54b6-4449-b9a3-c9c1d10b702d] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.00722568s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-20220921153616-3535 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p old-k8s-version-20220921153616-3535 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/old-k8s-version/serial/Pause (1.78s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-20220921153616-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535: exit status 2 (153.279493ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535: exit status 2 (152.80243ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-20220921153616-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
E0921 15:46:46.726387    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220921153616-3535 -n old-k8s-version-20220921153616-3535
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.78s)

TestStartStop/group/default-k8s-different-port/serial/FirstStart (58.2s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220921154653-3535 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:47:03.845005    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:47:31.610244    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
E0921 15:47:36.697631    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:47:40.246310    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220921154653-3535 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.2: (58.203854855s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/FirstStart (58.20s)

TestStartStop/group/default-k8s-different-port/serial/DeployApp (12.27s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-different-port-20220921154653-3535 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [a00837b9-319f-4c9d-87ec-810290993324] Pending
helpers_test.go:342: "busybox" [a00837b9-319f-4c9d-87ec-810290993324] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [a00837b9-319f-4c9d-87ec-810290993324] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: integration-test=busybox healthy within 12.014364405s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-different-port-20220921154653-3535 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-different-port/serial/DeployApp (12.27s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.63s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-different-port-20220921154653-3535 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-different-port-20220921154653-3535 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.63s)

TestStartStop/group/default-k8s-different-port/serial/Stop (8.24s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220921154653-3535 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220921154653-3535 --alsologtostderr -v=3: (8.240232715s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Stop (8.24s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.3s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535: exit status 7 (93.974784ms)

-- stdout --
	Stopped
-- /stdout --

start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-different-port-20220921154653-3535 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.30s)

TestStartStop/group/default-k8s-different-port/serial/SecondStart (311.48s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220921154653-3535 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:48:30.950696    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:30.956949    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:30.969018    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:30.989180    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:31.030814    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:31.111090    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:31.272912    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:31.593178    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:32.235412    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:33.517416    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:36.077671    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:36.241576    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:48:41.198780    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:44.481490    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.486651    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.498760    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.518908    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.560317    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.642098    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:44.804251    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:45.126296    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:45.766799    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:47.048852    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:49.609891    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:51.440645    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:48:54.731946    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:48:59.756539    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:49:03.300991    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
E0921 15:49:03.399570    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:49:04.972832    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:49:11.922714    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:49:14.548161    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/addons-20220921142649-3535/client.crt: no such file or directory
E0921 15:49:25.453738    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:49:31.111888    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/cilium-20220921151637-3535/client.crt: no such file or directory
E0921 15:49:48.895909    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/functional-20220921143144-3535/client.crt: no such file or directory
E0921 15:49:52.883685    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
E0921 15:49:59.288055    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
E0921 15:50:06.415400    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:50:25.397659    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:50:26.453921    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kindnet-20220921151637-3535/client.crt: no such file or directory
E0921 15:50:32.894304    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory

=== CONT  TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220921154653-3535 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.2: (5m11.285422641s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
--- PASS: TestStartStop/group/default-k8s-different-port/serial/SecondStart (311.48s)

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (12.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w8sqc" [f5b35ccf-af96-400b-bd0c-69f8810ef50c] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0921 15:50:56.403920    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/custom-flannel-20220921151637-3535/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w8sqc" [f5b35ccf-af96-400b-bd0c-69f8810ef50c] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 12.011970417s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (12.01s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w8sqc" [f5b35ccf-af96-400b-bd0c-69f8810ef50c] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.00672337s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-20220921154423-3535 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p embed-certs-20220921154423-3535 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/embed-certs/serial/Pause (1.84s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-20220921154423-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535: exit status 2 (155.503074ms)

-- stdout --
	Paused
-- /stdout --

start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535: exit status 2 (158.551564ms)

-- stdout --
	Stopped
-- /stdout --

start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-20220921154423-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220921154423-3535 -n embed-certs-20220921154423-3535
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.84s)

TestStartStop/group/newest-cni/serial/FirstStart (51.29s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220921155120-3535 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:51:22.201322    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/skaffold-20220921151524-3535/client.crt: no such file or directory
E0921 15:51:28.337190    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/old-k8s-version-20220921153616-3535/client.crt: no such file or directory
E0921 15:51:35.958211    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/bridge-20220921151637-3535/client.crt: no such file or directory
E0921 15:51:46.773209    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/kubenet-20220921151637-3535/client.crt: no such file or directory
E0921 15:51:48.445299    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/enable-default-cni-20220921151637-3535/client.crt: no such file or directory
E0921 15:51:55.942801    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/flannel-20220921151637-3535/client.crt: no such file or directory
E0921 15:52:03.890978    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/calico-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220921155120-3535 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.2: (51.28821108s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (51.29s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.77s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-20220921155120-3535 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.77s)

TestStartStop/group/newest-cni/serial/Stop (8.24s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-20220921155120-3535 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-20220921155120-3535 --alsologtostderr -v=3: (8.24220317s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.24s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535: exit status 7 (59.961946ms)

-- stdout --
	Stopped
-- /stdout --

start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-20220921155120-3535 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.27s)

TestStartStop/group/newest-cni/serial/SecondStart (31.7s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220921155120-3535 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.2
E0921 15:52:36.707870    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/ingress-addon-legacy-20220921143604-3535/client.crt: no such file or directory
E0921 15:52:40.256534    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/false-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220921155120-3535 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.2: (31.54009497s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (31.70s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.2s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p newest-cni-20220921155120-3535 "sudo crictl images -o json"
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.20s)

TestStartStop/group/newest-cni/serial/Pause (1.84s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-20220921155120-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535: exit status 2 (151.224389ms)

-- stdout --
	Paused
-- /stdout --

start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535: exit status 2 (153.806235ms)

-- stdout --
	Stopped
-- /stdout --

start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-20220921155120-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220921155120-3535 -n newest-cni-20220921155120-3535
--- PASS: TestStartStop/group/newest-cni/serial/Pause (1.84s)

TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (7.01s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6d2hc" [d1e3f605-602a-46f8-963f-7f5817be01c5] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6d2hc" [d1e3f605-602a-46f8-963f-7f5817be01c5] Running
E0921 15:53:30.953269    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/no-preload-20220921153727-3535/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 7.012667534s
--- PASS: TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (7.01s)

TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6d2hc" [d1e3f605-602a-46f8-963f-7f5817be01c5] Running
E0921 15:53:36.247483    3535 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14995-2679-411d4579fd248fd57a4259437564c3e08f354535/.minikube/profiles/auto-20220921151637-3535/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.006520432s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-different-port-20220921154653-3535 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.16s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p default-k8s-different-port-20220921154653-3535 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.16s)

TestStartStop/group/default-k8s-different-port/serial/Pause (1.81s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p default-k8s-different-port-20220921154653-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535: exit status 2 (149.75887ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535: exit status 2 (151.625099ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p default-k8s-different-port-20220921154653-3535 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220921154653-3535 -n default-k8s-different-port-20220921154653-3535
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Pause (1.81s)

Test skip (16/299)

TestDownloadOnly/v1.16.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

TestDownloadOnly/v1.16.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

TestDownloadOnly/v1.25.2/cached-images (0s)

=== RUN   TestDownloadOnly/v1.25.2/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.25.2/cached-images (0.00s)

TestDownloadOnly/v1.25.2/binaries (0s)

=== RUN   TestDownloadOnly/v1.25.2/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.25.2/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:214: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:450: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:542: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:291: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestStartStop/group/disable-driver-mounts (0.44s)

=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-20220921154652-3535" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-20220921154652-3535
--- SKIP: TestStartStop/group/disable-driver-mounts (0.44s)